
Student: Erika Samara Alvares Angelim

Personal website: https://www.erikaalvares.es/

GitHub: https://github.com/ea-analisisdatos/Programa_Inteligencia_Artificial

Credit Card Fraud Detection¶

Dataset link: https://www.kaggle.com/datasets/mlg-ulb/creditcardfraud

Table of Contents¶

  • Credit Card Fraud Detection
  • Table of Contents
  • Introduction
    • Project Objective
    • Problem Description
  • Methodology
  • Environment Setup
    • Install the required libraries
    • Import the required libraries
    • Configure the environment
    • Configure Drive
    • Import and organize the dataset
  • Data Exploration
    • Business Questions
    • Data Analysis
  • Initial Data Visualization
    • Business Questions
  • Preprocessing
    • Data Cleaning
      • Distribution of the "Class" variable before cleaning
      • Missing values
      • Duplicate records
      • Skewness Correction in Numeric Features
      • Anomaly Detection (method: Isolation Forest)
      • Regenerate plots with clean data
      • Removal of irrelevant columns
      • Feature Scaling
  • Comparative Model Evaluation
    • Split the dataset
    • Techniques
      • Technique 1: Training on the Original Dataset
      • Technique 2: Balancing with SMOTE
      • Technique 3: Balancing with RandomUnderSampler
      • Technique 4: Cross-Validation of the Best Models (balanced with SMOTE)
      • Technique 5: Cross-Validation of the Best Models (original dataset)
      • Technique 6: Ensembles with SMOTE
      • Technique 7: Ensembles with the original dataset (clean_data)
    • Summary Table of Results
    • Evaluate the processing-time impact of SMOTE
  • Best Models with AUPRC > 80%
    • Identify the Candidate Metrics for the Best Model
      • Scatter Plot (AUPRC vs Recall)
      • Comparison of Key Metrics
      • Summary Table of the False Negative Rate (FNR)
      • Comparison of the False Negative Rate (FNR)
      • FNR vs Recall
    • Results and Analysis
  • Selecting Algorithms/Models for Production
    • Function for Training, Evaluation, and Overfitting Detection
      • Cross-Validation with cross_val_score (ADASYN)
      • Cross-Validation with cross_val_score (with SMOTE)
      • Cross-Validation with cross_val_score (WITHOUT SMOTE)
      • Hyperparameter Optimization with GridSearchCV (with SMOTE)
      • Hyperparameter Optimization with Optuna (with SMOTE)
    • Consolidating Model Metrics and Hyperparameters
    • Select the Winning Model
    • Load the Most Recent Model
    • Make Predictions
  • Classification Report and Metrics
  • Application to Test the Model on New Data
    • Generate simulated data in a CSV file
    • Interactive prediction: import a CSV and predict on new data
    • Visualizations
  • Conclusions
  • Complete Code to Generate the Dynamic PDF
  • Export the notebook to HTML and PDF
    • Download the generated file directly from Colab

Introduction¶

Project overview: This project is designed to equip participants with the skills and knowledge needed to use AI effectively for detecting fraudulent activity in credit card transactions.

🎯 Project Objective¶

  1. Explore the dataset
    Perform an exploratory data analysis (EDA) to understand the structure of the dataset, identify its main features, and detect initial patterns in the variables.

  2. Visualize differences between classes
    Produce comparative plots to identify key differences between fraudulent and non-fraudulent transactions, focusing on distributions, outliers, and class-specific behavior.

  3. Correct the class imbalance
    Apply data-balancing techniques to address the pronounced class imbalance, ensuring that the models can learn effectively and improve their ability to detect fraud.

  4. Develop a predictive model
    Train and evaluate Machine Learning models to detect fraudulent transactions with high precision, prioritizing robust performance.

  5. Minimize false positives and false negatives
    Optimize the models to reduce both false positives (flagging fraud where there is none) and false negatives (missing real fraud), mitigating the associated risks and costs.

  6. Select the best model
    Compare model performance using key metrics such as AUPRC, Recall, and FNR (False Negative Rate) to select the model with the best balance between precision and generalization.

  7. Propose practical solutions
    Deliver a deployable model for automatic fraud detection, supporting informed decision-making and reducing financial losses.

Problem Description¶

In credit card fraud detection, the main challenge for card companies is to protect customers from unauthorized charges and to reduce financial losses 💸.

📊 Dataset Description

  • This dataset contains anonymized transactions made by European cardholders in September 2013, labeled as fraudulent or genuine (non-fraudulent).
  • Because frauds account for only 0.172% of all transactions (492 of 284,807), the dataset is highly imbalanced, which poses a major challenge for detection models.
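The stated imbalance is simple arithmetic and can be double-checked directly (plain Python, no dataset required):

```python
# Verify the stated imbalance: 492 frauds out of 284,807 transactions.
frauds, total = 492, 284_807
fraud_pct = frauds / total * 100
print(f"Fraudulent share: {fraud_pct:.3f}%")
```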

📋 Data Characteristics

  • Principal components (V1, V2, … V28): transformed with PCA to preserve confidentiality.
  • ⏰ Time: seconds elapsed since the first transaction.
  • 💵 Amount: the transaction amount, useful for cost-sensitive learning.
  • 🎯 Class: the target variable, where 1 indicates fraud and 0 a genuine transaction.

🚀 Challenges and recommendations for tackling the problem:

  1. Use specialized metrics such as the Area Under the Precision-Recall Curve (AUPRC/AUC-PR), since plain confusion-matrix accuracy is not meaningful on imbalanced datasets.
    AUPRC/AUC-PR is the metric of choice in scenarios such as fraud detection. Recall is crucial because we want to minimize the cases where a fraud goes undetected (false negatives).
    High precision also matters, to avoid classifying legitimate transactions as fraudulent (false positives).

  2. Models must discriminate effectively between the minority and majority classes, ensuring reliable detection without compromising accuracy on genuine transactions.
    This approach enables effective fraud detection, protecting both consumers and businesses from unauthorized transactions.
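A minimal sketch of why these recommendations matter (synthetic labels, not the real dataset): a degenerate model that always predicts "genuine" still scores 98% accuracy on a 2%-fraud sample, while AUPRC, computed from scores, stays near the positive rate for an uninformative scorer.

```python
import numpy as np
from sklearn.metrics import accuracy_score, average_precision_score

# Hypothetical toy labels: 2 frauds out of 100 transactions.
y_true = np.zeros(100, dtype=int)
y_true[[10, 60]] = 1

# A useless model that predicts "genuine" for everything still looks great
# on accuracy -- this is exactly why accuracy misleads on imbalanced data.
y_pred = np.zeros(100, dtype=int)
acc = accuracy_score(y_true, y_pred)

# AUPRC works on continuous scores; uninformative random scores give a
# value far below 1, roughly around the positive-class rate.
rng = np.random.default_rng(0)
auprc = average_precision_score(y_true, rng.random(100))
print(f"accuracy={acc:.2f}  AUPRC={auprc:.3f}")
```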

Methodology¶

Explanation of the techniques, the models evaluated, and how class imbalance is handled.

Technique: Organization and Purpose

In this project, the preprocessing and training stages are structured as a series of techniques so that we can systematically evaluate how different methods affect model performance in fraud detection. This organized approach makes it possible to compare results objectively and select the most effective combination of techniques and machine learning algorithms.

The process unfolds in several phases:

  1. Initial evaluation of techniques:

    • Several preprocessing techniques are applied, including class balancing (oversampling, undersampling, and SMOTE), together with a range of models.
    • The results are analyzed to identify the algorithms and techniques that perform best on key metrics, such as AUPRC for imbalanced datasets.
  2. Selection of the best models:

    • Based on the results of these techniques, the models with the best metric values are selected.
  3. Cross-validation:

    • The selected models go through an additional technique that applies cross-validation. This ensures that model performance is consistent and does not depend on a particular split of the data.
  4. Ensemble techniques:

    • Advanced methods such as VotingClassifier and StackingClassifier are applied to combine the selected models.
    • VotingClassifier combines models by voting (majority vote or weighted average of probabilities), which strengthens the robustness of the final result.
    • StackingClassifier uses a meta-model that learns from the predictions of the base classifiers, improving accuracy.
  5. Final model and serialization:

    • Once all stages are complete, the model with the best overall performance is identified.
    • This final model is serialized to a .pkl file for deployment in production environments.

This iterative, structured approach ensures that the final model is not only accurate but also generalizes well when detecting fraud.

⚙️ Environment Setup and Data Import¶

In this section we import the libraries needed for the project. We configure the environment so that all table columns are displayed without truncation. We also enable access to Google Drive to read the CSV file and, finally, we import and organize the dataset.

✔️Install the required libraries¶

In [3]:
%%time
import sys
import subprocess
from importlib.metadata import version, PackageNotFoundError

# List of required libraries
required_libraries = [
    'xgboost', 'catboost', 'lightgbm',
    'imbalanced-learn', 'scikit-learn', 'nbconvert',
    'notebook-as-pdf', 'PyPDF2==2.12.1', 'pyppeteer',
    'adjustText', 'mplcursors', 'joblib', 'optuna',
    'weasyprint', 'dask-expr', 'shap',
    'nbconvert[webpdf] pyppeteer',  # malformed spec (two packages in one string); pip rejects it, see the output below
    'playwright'
]

# System dependencies needed for Pandoc and PDF export
system_dependencies = [
    'texlive-xetex', 'texlive-fonts-recommended', 'pandoc'
]

# Install system dependencies in Google Colab
def install_system_dependencies(dependencies):
    try:
        print("Installing system dependencies required for "
              "Pandoc and nbconvert...")
        subprocess.check_call(['apt-get', 'update'], stdout=subprocess.DEVNULL,
                              stderr=subprocess.DEVNULL)
        subprocess.check_call(['apt-get', 'install', '-y'] + dependencies,
                              stdout=subprocess.DEVNULL,
                              stderr=subprocess.DEVNULL)
        print("System dependency installation complete.")
    except subprocess.CalledProcessError as e:
        print(f"Error installing system dependencies: {e}")
        sys.exit("Could not install the system dependencies. "
                 "Check the permissions or configuration of your environment.")


# Install any missing Python libraries
def install_libraries(libraries):
    for lib in libraries:
        try:
            print(f"Installing {lib}...")
            subprocess.check_call([sys.executable, '-m', 'pip', 'install', lib])
        except Exception as e:
            print(f"Error installing {lib}: {e}")

# Check which libraries are missing
missing_libraries = []
for lib in required_libraries:
    try:
        # Check whether it is already installed (strip any pinned version)
        installed = version(lib.split('==')[0])
        print(f"{lib} is already installed, version: {installed}")
    except PackageNotFoundError:
        print(f"{lib} is not installed.")
        missing_libraries.append(lib)

# Resolve urllib3 version conflicts if pyppeteer is in the list
if 'pyppeteer' in missing_libraries or 'pyppeteer' in required_libraries:
    try:
        print("Resolving urllib3 conflicts for pyppeteer...")
        subprocess.check_call([sys.executable, '-m', 'pip', 'uninstall',
                               'urllib3', '-y'])
        subprocess.check_call([sys.executable, '-m', 'pip', 'install',
                               'urllib3<2.0.0'])
        print("urllib3 version adjusted for compatibility with pyppeteer.")
    except Exception as e:
        print(f"Error adjusting urllib3: {e}")

# Install system dependencies in Google Colab
install_system_dependencies(system_dependencies)

# Install missing libraries
if missing_libraries:
    install_libraries(missing_libraries)
else:
    print("All required libraries are already installed.")

# Print the versions of the installed libraries
print("\nInstalled library versions:")
for lib in required_libraries:
    try:
        print(f"{lib}: {version(lib.split('==')[0])}")
    except PackageNotFoundError:
        print(f"{lib}: Not installed")

# List every installed library
#print("\nFull list of installed libraries:")
#!pip list
xgboost is already installed, version: 2.1.3
catboost is not installed.
lightgbm is already installed, version: 4.5.0
imbalanced-learn is already installed, version: 0.12.4
scikit-learn is already installed, version: 1.6.0
nbconvert is already installed, version: 7.16.4
notebook-as-pdf is not installed.
PyPDF2==2.12.1 is not installed.
pyppeteer is not installed.
adjustText is not installed.
mplcursors is not installed.
joblib is already installed, version: 1.4.2
optuna is not installed.
weasyprint is not installed.
dask-expr is not installed.
shap is already installed, version: 0.46.0
nbconvert[webpdf] pyppeteer is not installed.
playwright is already installed, version: 1.49.1
Resolving urllib3 conflicts for pyppeteer...
urllib3 version adjusted for compatibility with pyppeteer.
Installing system dependencies required for Pandoc and nbconvert...
System dependency installation complete.
Installing catboost...
Installing notebook-as-pdf...
Installing PyPDF2==2.12.1...
Installing pyppeteer...
Installing adjustText...
Installing mplcursors...
Installing optuna...
Installing weasyprint...
Installing dask-expr...
Installing nbconvert[webpdf] pyppeteer...
Error installing nbconvert[webpdf] pyppeteer: Command '['/usr/bin/python3', '-m', 'pip', 'install', 'nbconvert[webpdf] pyppeteer']' returned non-zero exit status 1.

Installed library versions:
xgboost: 2.1.3
catboost: 1.2.7
lightgbm: 4.5.0
imbalanced-learn: 0.12.4
scikit-learn: 1.6.0
nbconvert: 7.16.4
notebook-as-pdf: 0.5.0
PyPDF2==2.12.1: 2.12.1
pyppeteer: 2.0.0
adjustText: 1.3.0
mplcursors: 0.6
joblib: 1.4.2
optuna: 4.1.0
weasyprint: 63.1
dask-expr: 1.1.21
shap: 0.46.0
nbconvert[webpdf] pyppeteer: Not installed
playwright: 1.49.1
CPU times: user 764 ms, sys: 108 ms, total: 872 ms
Wall time: 2min 21s

✔️Import the required libraries¶

In [4]:
# Basic libraries: data handling and visualization.
import pandas as pd  # tabular data handling and manipulation
import numpy as np  # numerical operations, matrices and arrays
import matplotlib.pyplot as plt  # plots and visualizations
import seaborn as sns  # statistical data visualization
from adjustText import adjust_text  # avoid label overlap in plots
import mplcursors  # interactive plots (hover)
import os

# Data processing: preprocessing and dimensionality reduction.
from sklearn.preprocessing import RobustScaler  # robust scaling for data with outliers
from sklearn.decomposition import PCA  # dimensionality reduction
from sklearn.feature_selection import mutual_info_classif  # feature selection via mutual information

# Validation and data splitting: train/test split and cross-validation.
from sklearn.model_selection import (
    train_test_split,  # split into training and test sets
    StratifiedKFold,  # stratified cross-validation
    cross_val_score,  # quick cross-validation
    GridSearchCV  # grid search over hyperparameters
)

# Class imbalance: SMOTE, undersampling, and combined techniques.
from imblearn.over_sampling import SMOTE  # oversampling of minority classes
from imblearn.combine import SMOTEENN  # combination of SMOTE and undersampling
from imblearn.under_sampling import RandomUnderSampler  # undersampling of majority classes
from imblearn.pipeline import Pipeline

# Algorithms: Machine Learning models.
from sklearn.ensemble import (
    RandomForestClassifier,  # random forests
    IsolationForest,  # anomaly detection
    VotingClassifier,  # voting ensemble
    StackingClassifier  # stacking ensemble
)
from sklearn.linear_model import LogisticRegression  # logistic regression
from catboost import CatBoostClassifier  # CatBoost classifier
from xgboost import XGBClassifier  # XGBoost classifier
from lightgbm import LGBMClassifier  # LightGBM classifier

# Evaluation metrics: the metrics used throughout the project.
from sklearn.metrics import (
    accuracy_score,  # model accuracy
    precision_score,  # precision
    recall_score,  # sensitivity / recall
    f1_score,  # F1-score: balance between precision and recall
    roc_auc_score,  # area under the ROC curve
    average_precision_score,  # area under the Precision-Recall curve
    confusion_matrix,  # confusion matrix
    ConfusionMatrixDisplay,  # confusion matrix plotting
    precision_recall_curve,  # Precision-Recall curve
    roc_curve,  # ROC curve
    auc,  # area-under-curve computation
    classification_report,  # full metrics summary
    balanced_accuracy_score,  # balanced accuracy
    matthews_corrcoef  # Matthews correlation coefficient
)

# Hyperparameter optimization: tools for model tuning.
import optuna  # Bayesian hyperparameter optimization
import joblib  # serialization and deserialization of trained models
import shap

# Additional libraries: statistics, feature selection, timing.
from scipy.stats import shapiro  # normality test
import time  # measure processing time

# Styling: tools to improve console output.
from termcolor import colored as style  # colored text in console output
import dask

✔️Configure the environment¶

In [5]:
import warnings
warnings.filterwarnings("ignore")

pd.set_option('display.max_columns', None)
pd.set_option('display.max_rows', None)
pd.options.display.float_format = '{:.8f}'.format
#pd.set_option('display.width', 1000)

# Widen the Jupyter/Colab output container:
from IPython.display import display, HTML
display(HTML("<style>.container { width:100% !important; }</style>"))


plt.style.use('ggplot')

#plt.style.use('fivethirtyeight')
#plt.rcParams['figure.figsize'] = (10, 6)

✔️Configure Drive¶

In [6]:
%%time
# Mount the drive to read the csv
from google.colab import drive

# Mount Google Drive
drive.mount('/content/drive')

# Access the CSV file via its path in your Google Drive
file_path = '/content/drive/MyDrive/Curso_IA_SkillUp_IBM/creditcard.csv'
Mounted at /content/drive
CPU times: user 1.25 s, sys: 205 ms, total: 1.45 s
Wall time: 22.3 s

✔️Import and organize the dataset¶

In [ ]:
# Load the data into a dataframe
data = pd.read_csv(file_path)
In [ ]:
print("Number of rows:", data.shape[0])
print("Number of columns:", data.shape[1])
Number of rows: 284807
Number of columns: 31
In [ ]:
# Identify the names of all the columns
data.columns
Out[ ]:
Index(['Time', 'V1', 'V2', 'V3', 'V4', 'V5', 'V6', 'V7', 'V8', 'V9', 'V10',
       'V11', 'V12', 'V13', 'V14', 'V15', 'V16', 'V17', 'V18', 'V19', 'V20',
       'V21', 'V22', 'V23', 'V24', 'V25', 'V26', 'V27', 'V28', 'Amount',
       'Class'],
      dtype='object')
In [ ]:
# Attribute information
data.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 284807 entries, 0 to 284806
Data columns (total 31 columns):
 #   Column  Non-Null Count   Dtype  
---  ------  --------------   -----  
 0   Time    284807 non-null  float64
 1   V1      284807 non-null  float64
 2   V2      284807 non-null  float64
 3   V3      284807 non-null  float64
 4   V4      284807 non-null  float64
 5   V5      284807 non-null  float64
 6   V6      284807 non-null  float64
 7   V7      284807 non-null  float64
 8   V8      284807 non-null  float64
 9   V9      284807 non-null  float64
 10  V10     284807 non-null  float64
 11  V11     284807 non-null  float64
 12  V12     284807 non-null  float64
 13  V13     284807 non-null  float64
 14  V14     284807 non-null  float64
 15  V15     284807 non-null  float64
 16  V16     284807 non-null  float64
 17  V17     284807 non-null  float64
 18  V18     284807 non-null  float64
 19  V19     284807 non-null  float64
 20  V20     284807 non-null  float64
 21  V21     284807 non-null  float64
 22  V22     284807 non-null  float64
 23  V23     284807 non-null  float64
 24  V24     284807 non-null  float64
 25  V25     284807 non-null  float64
 26  V26     284807 non-null  float64
 27  V27     284807 non-null  float64
 28  V28     284807 non-null  float64
 29  Amount  284807 non-null  float64
 30  Class   284807 non-null  int64  
dtypes: float64(30), int64(1)
memory usage: 67.4 MB
In [ ]:
# Show the first 5 rows of the dataframe
print("First 5 rows of the dataframe:")
print(data.head())
First 5 rows of the dataframe:
        Time          V1          V2         V3          V4          V5  \
0 0.00000000 -1.35980713 -0.07278117 2.53634674  1.37815522 -0.33832077   
1 0.00000000  1.19185711  0.26615071 0.16648011  0.44815408  0.06001765   
2 1.00000000 -1.35835406 -1.34016307 1.77320934  0.37977959 -0.50319813   
3 1.00000000 -0.96627171 -0.18522601 1.79299334 -0.86329128 -0.01030888   
4 2.00000000 -1.15823309  0.87773675 1.54871785  0.40303393 -0.40719338   

           V6          V7          V8          V9         V10         V11  \
0  0.46238778  0.23959855  0.09869790  0.36378697  0.09079417 -0.55159953   
1 -0.08236081 -0.07880298  0.08510165 -0.25542513 -0.16697441  1.61272666   
2  1.80049938  0.79146096  0.24767579 -1.51465432  0.20764287  0.62450146   
3  1.24720317  0.23760894  0.37743587 -1.38702406 -0.05495192 -0.22648726   
4  0.09592146  0.59294075 -0.27053268  0.81773931  0.75307443 -0.82284288   

          V12         V13         V14         V15         V16         V17  \
0 -0.61780086 -0.99138985 -0.31116935  1.46817697 -0.47040053  0.20797124   
1  1.06523531  0.48909502 -0.14377230  0.63555809  0.46391704 -0.11480466   
2  0.06608369  0.71729273 -0.16594592  2.34586495 -2.89008319  1.10996938   
3  0.17822823  0.50775687 -0.28792375 -0.63141812 -1.05964725 -0.68409279   
4  0.53819555  1.34585159 -1.11966983  0.17512113 -0.45144918 -0.23703324   

          V18         V19         V20         V21         V22         V23  \
0  0.02579058  0.40399296  0.25141210 -0.01830678  0.27783758 -0.11047391   
1 -0.18336127 -0.14578304 -0.06908314 -0.22577525 -0.63867195  0.10128802   
2 -0.12135931 -2.26185710  0.52497973  0.24799815  0.77167940  0.90941226   
3  1.96577500 -1.23262197 -0.20803778 -0.10830045  0.00527360 -0.19032052   
4 -0.03819479  0.80348692  0.40854236 -0.00943070  0.79827849 -0.13745808   

          V24         V25         V26         V27         V28       Amount  \
0  0.06692807  0.12853936 -0.18911484  0.13355838 -0.02105305 149.62000000   
1 -0.33984648  0.16717040  0.12589453 -0.00898310  0.01472417   2.69000000   
2 -0.68928096 -0.32764183 -0.13909657 -0.05535279 -0.05975184 378.66000000   
3 -1.17557533  0.64737603 -0.22192884  0.06272285  0.06145763 123.50000000   
4  0.14126698 -0.20600959  0.50229222  0.21942223  0.21515315  69.99000000   

   Class  
0      0  
1      0  
2      0  
3      0  
4      0  
In [ ]:
# Show the last 5 rows of the dataframe
print("Last 5 rows of the dataframe:")
print(data.tail())
Last 5 rows of the dataframe:
                  Time           V1          V2          V3          V4  \
284802 172786.00000000 -11.88111789 10.07178497 -9.83478346 -2.06665568   
284803 172787.00000000  -0.73278867 -0.05508049  2.03502975 -0.73858858   
284804 172788.00000000   1.91956501 -0.30125385 -3.24963981 -0.55782812   
284805 172788.00000000  -0.24044005  0.53048251  0.70251023  0.68979917   
284806 172792.00000000  -0.53341252 -0.18973334  0.70333737 -0.50627124   

                V5          V6          V7          V8         V9         V10  \
284802 -5.36447278 -2.60683733 -4.91821543  7.30533402 1.91442827  4.35617041   
284803  0.86822940  1.05841527  0.02432970  0.29486870 0.58480002 -0.97592606   
284804  2.63051512  3.03126010 -0.29682653  0.70841718 0.43245405 -0.48478176   
284805 -0.37796113  0.62370772 -0.68617999  0.67914546 0.39208671 -0.39912565   
284806 -0.01254568 -0.64961669  1.57700625 -0.41465041 0.48617951 -0.91542665   

               V11         V12         V13         V14         V15  \
284802 -1.59310526  2.71194079 -0.68925561  4.62694203 -0.92445871   
284803 -0.15018885  0.91580191  1.21475585 -0.67514296  1.16493091   
284804  0.41161374  0.06311886 -0.18369869 -0.51060184  1.32928351   
284805 -1.93384882 -0.96288614 -1.04208166  0.44962444  1.96256312   
284806 -1.04045834 -0.03151305 -0.18809290 -0.08431647  0.04133346   

               V16         V17         V18         V19        V20        V21  \
284802  1.10764060  1.99169111  0.51063233 -0.68291968 1.47582913 0.21345411   
284803 -0.71175735 -0.02569286 -1.22117886 -1.54555609 0.05961590 0.21420534   
284804  0.14071598  0.31350179  0.39565248 -0.57725184 0.00139597 0.23204504   
284805 -0.60857713  0.50992846  1.11398059  2.89784877 0.12743352 0.26524492   
284806 -0.30262009 -0.66037665  0.16742993 -0.25611687 0.38294810 0.26105733   

              V22         V23         V24         V25         V26         V27  \
284802 0.11186374  1.01447990 -0.50934845  1.43680691  0.25003428  0.94365117   
284803 0.92438358  0.01246304 -1.01622567 -0.60662399 -0.39525507  0.06847247   
284804 0.57822901 -0.03750086  0.64013388  0.26574545 -0.08737060  0.00445477   
284805 0.80004874 -0.16329794  0.12320524 -0.56915886  0.54666846  0.10882073   
284806 0.64307844  0.37677701  0.00879738 -0.47364870 -0.81826712 -0.00241531   

               V28       Amount  Class  
284802  0.82373096   0.77000000      0  
284803 -0.05352739  24.79000000      0  
284804 -0.02656083  67.88000000      0  
284805  0.10453282  10.00000000      0  
284806  0.01364891 217.00000000      0  

🕵️‍♀️ Data Exploration¶

In this section we perform an exploratory analysis of the data and answer some business questions.

Business Questions¶

🤔 Question 1: What percentage of the transactions in the dataset are fraudulent and non-fraudulent?

In [ ]:
print('Non-fraudulent transactions make up',
      round(data['Class'].value_counts()[0]/len(data) * 100, 2),
      '% of the dataset')

print('Fraudulent transactions make up',
      round(data['Class'].value_counts()[1]/len(data) * 100, 2),
      '% of the dataset')
Non-fraudulent transactions make up 99.83 % of the dataset
Fraudulent transactions make up 0.17 % of the dataset
In [ ]:
# Boxplot of transaction amounts by class

'''
This plot complements the analysis well and can feed into fraud-detection
models or rules:

- It reveals behavioral patterns in the amounts of fraudulent transactions.
- It helps determine whether fraudulent amounts are higher or more variable
than non-fraudulent ones.
- It provides practical input for setting anomaly-detection thresholds.
'''

# Split the data for the analysis
fraudulent_data = data[data['Class'] == 1]
non_fraudulent_data = data[data['Class'] == 0]

# Class counts
fraudulent_count = fraudulent_data.shape[0]
non_fraudulent_count = non_fraudulent_data.shape[0]

# Total number of transactions
total_transactions = fraudulent_count + non_fraudulent_count

# Percentage of each class
fraud_percentage = (fraudulent_count / total_transactions) * 100
non_fraud_percentage = (non_fraudulent_count / total_transactions) * 100

# Tick labels including the percentages
# (ordered to match the tick positions: Class 0 first, then Class 1)
class_labels = [
    f"Non-Fraudulent ({non_fraud_percentage:.2f}%)",
    f"Fraudulent ({fraud_percentage:.2f}%)"
]

# ---- Introductory explanation ----
print("📊 **Plot: Distribution of Transaction Amounts by Class**\n")
print("**Goal:**")
print("This plot compares the distribution of transaction amounts between the fraudulent and non-fraudulent classes.")
print("\n**What to look for:**")
print("1. The median (the line inside each box) indicates the typical amount for each class.")
print("2. The size of the box shows the spread (interquartile range, IQR) of the amounts.")
print("3. Points outside the box are **outliers** (atypical values), which are especially important for fraud.")
print("4. If the fraudulent class has significantly higher or more dispersed amounts, that can be a key pattern for fraud detection.\n\n")

# Boxplot with custom colors
plt.figure(figsize=(10, 6))
palette_colors = {0: '#3498DB', 1: '#FF5733'}  # int keys, matching the int 'Class' column
sns.boxplot(x='Class', y='Amount', data=data, palette=palette_colors)

# Plot settings
plt.title("Distribution of Transaction Amounts by Class", fontsize=14, fontweight='bold')
plt.xlabel("Class", fontsize=12)
plt.ylabel("Transaction Amount", fontsize=12)
plt.xticks([0, 1], class_labels)  # replace tick labels with the percentages
plt.grid(axis='y', linestyle='--', alpha=0.5)

# Show the plot
plt.tight_layout()
plt.show()

# ---- Closing explanation ----
print("\n🔍 **Reading the Plot:**")
print("1. If the box for fraudulent transactions is higher or more spread out, it suggests that frauds tend to have larger or more varied amounts.")
print("2. The **outliers** in the fraudulent class are especially important, since they represent atypical amounts that could be clear signals of fraud.")
print("3. Compare the medians of the two classes to spot significant differences in amounts.")
📊 **Plot: Distribution of Transaction Amounts by Class**

**Goal:**
This plot compares the distribution of transaction amounts between the fraudulent and non-fraudulent classes.

**What to look for:**
1. The median (the line inside each box) indicates the typical amount for each class.
2. The size of the box shows the spread (interquartile range, IQR) of the amounts.
3. Points outside the box are **outliers** (atypical values), which are especially important for fraud.
4. If the fraudulent class has significantly higher or more dispersed amounts, that can be a key pattern for fraud detection.


[Figure: boxplot of transaction amounts by class]
🔍 **Reading the Plot:**
1. If the box for fraudulent transactions is higher or more spread out, it suggests that frauds tend to have larger or more varied amounts.
2. The **outliers** in the fraudulent class are especially important, since they represent atypical amounts that could be clear signals of fraud.
3. Compare the medians of the two classes to spot significant differences in amounts.

👁️ Observación:

  • Clases desbalanceadas:
    • Se observa un gran desbalance entre las clases: las transacciones fraudulentas representan apenas un 0.17% de todo el dataset. Más adelante en el proyecto se analizarán técnicas como el sobremuestreo o el submuestreo.
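La proporción de cada clase puede verificarse directamente con `value_counts(normalize=True)`. Un boceto mínimo con un DataFrame de juguete (`toy` es hipotético, no el dataset real; sobre `data['Class']` el mismo cálculo arroja el 0.17% mencionado):

```python
import pandas as pd

# DataFrame de juguete que imita un fuerte desbalance de clases
toy = pd.DataFrame({'Class': [0] * 997 + [1] * 3})

# Proporción de cada clase en porcentaje
class_pct = toy['Class'].value_counts(normalize=True) * 100
print(class_pct.round(2))
```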


🤔 Pregunta 2: ¿Cuál es el importe medio de las transacciones fraudulentas y no-fraudulentas?

In [ ]:
# Separando los datos para el análisis
fraudulent_data = data[data['Class'] == 1]
non_fraudulent_data = data[data['Class'] == 0]

# Calcula el importe medio de las transacciones fraudulentas
average_amount_fraudulent = fraudulent_data['Amount'].mean()

# Calcula el importe medio de las transacciones no-fraudulentas
average_amount_non_fraudulent = non_fraudulent_data['Amount'].mean()

print("El importe medio de las transacciones fraudulentas es: ",
round(average_amount_fraudulent, 2))

print("El importe medio de las transacciones no-fraudulentas (legítimas) es: ",
round(average_amount_non_fraudulent, 2))
El importe medio de las transacciones fraudulentas es:  122.21
El importe medio de las transacciones no-fraudulentas (legítimas) es:  88.29

👁️ Observación:

  • Importe medio de las transacciones:
    Se observa que el importe medio de las transacciones fraudulentas (122.21) es mayor que el de las no fraudulentas (88.29), lo cual puede servir como referencia para establecer umbrales en la detección de anomalías.
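Como ilustración de esa idea de umbral (boceto hipotético, no el método del proyecto): una regla simple puede marcar para revisión los montos por encima de media + 2 desviaciones estándar.

```python
import pandas as pd

# Montos de ejemplo (hipotéticos); en el proyecto sería data['Amount']
amounts = pd.Series([1.0, 5.0, 9.0, 22.0, 88.0, 120.0, 500.0, 2000.0])

# Umbral simple: media + 2 desviaciones estándar muestrales
threshold = amounts.mean() + 2 * amounts.std()

# Transacciones candidatas a revisión manual
flagged = amounts[amounts > threshold]
print(f"Umbral: {threshold:.2f}; marcadas: {len(flagged)}")
```

En la práctica, una regla así solo serviría como filtro preliminar; los modelos posteriores del proyecto aprenden patrones mucho más ricos que un corte sobre el monto.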


Análisis de los datos¶

Resumen de las características del dataset.

In [ ]:
# Montos que más se repiten en las transacciones fraudulentas
#fraudulent_data['Amount'].value_counts()

# Montos que se repiten más de una vez en las transacciones fraudulentas
# (2 decimales y ordenados)
fraudulent_data = fraudulent_data.copy()  # Evitar SettingWithCopyWarning
fraudulent_data['Amount'] = fraudulent_data['Amount'].round(2)  # Redondear a 2 decimales

amount_counts = (
    fraudulent_data['Amount']
    .value_counts()
    .sort_values(ascending=False)
)  # Contar y ordenar

# Filtrar solo montos que se repiten más de una vez (ya ordenados de mayor a menor)
repeated_amounts = amount_counts[amount_counts > 1]

# Mostrar los resultados con formato de dos decimales
print("Montos (>1) que más se repiten en las transacciones fraudulentas:")
for amount, count in repeated_amounts.items():
    print(f"{amount:.2f}: {count}")
Montos (>1) que más se repiten en las transacciones fraudulentas:
1.00: 113
0.00: 27
99.99: 27
0.76: 17
0.77: 10
0.01: 5
2.00: 4
3.79: 4
1.10: 3
12.31: 3
2.28: 3
1.18: 3
0.68: 3
39.45: 2
30.31: 2
44.90: 2
94.82: 2
1.59: 2
1.63: 2
105.89: 2
101.50: 2
88.23: 2
45.51: 2
104.03: 2
19.02: 2
1.52: 2
78.00: 2
316.06: 2
7.59: 2
8.00: 2
18.96: 2
512.25: 2
0.83: 2
252.92: 2
723.21: 2
188.52: 2
111.70: 2
In [ ]:
# Conteo de transacciones con Amount=0 por clase
amount_zero_fraud = fraudulent_data[fraudulent_data['Amount'] == 0].shape[0]
amount_zero_nonfraud = data[(data['Class'] == 0) & \
                      (data['Amount'] == 0)].shape[0]

# Porcentajes relativos
print("Transacciones fraudulentas con Amount=0:", amount_zero_fraud)
print("Transacciones no fraudulentas con Amount=0:", amount_zero_nonfraud)
print("Porcentaje en fraudulentas:", \
                      (amount_zero_fraud / len(fraudulent_data)) * 100, "%")

print("Porcentaje en no fraudulentas:", \
 (amount_zero_nonfraud / len(data[data['Class'] == 0])) * 100, "%")
Transacciones fraudulentas con Amount=0: 27
Transacciones no fraudulentas con Amount=0: 1798
Porcentaje en fraudulentas: 5.487804878048781 %
Porcentaje en no fraudulentas: 0.6323971651161564 %

👁️ Observación:

  • Mantener las transacciones con Amount = 0:
    • Aunque hay menos transacciones fraudulentas con Amount = 0 en términos absolutos, la proporción más alta dentro de las transacciones fraudulentas hace que sea relevante mantener estos datos en el análisis.

    • Podrían representar un patrón importante, como intentos de fraude con transacciones pequeñas o pruebas de tarjeta.


In [ ]:
# Verificando el estadístico de la variable monto para transacciones
# fraudulentas
fraudulent_data[['Amount']].describe()
Out[ ]:
Amount
count 492.00000000
mean 122.21132114
std 256.68328830
min 0.00000000
25% 1.00000000
50% 9.25000000
75% 105.89000000
max 2125.87000000
In [ ]:
# Verificando el estadístico de la variable monto para transacciones legítimas
# (No-Fraudulentas)
non_fraudulent_data[['Amount']].describe()
Out[ ]:
Amount
count 284315.00000000
mean 88.29102242
std 250.10509223
min 0.00000000
25% 5.65000000
50% 22.00000000
75% 77.05000000
max 25691.16000000
In [ ]:
# Estadísticas descriptivas del dataset
print(data.describe().T)
                 count           mean            std           min  \
Time   284807.00000000 94813.85957508 47488.14595457    0.00000000   
V1     284807.00000000     0.00000000     1.95869580  -56.40750963   
V2     284807.00000000     0.00000000     1.65130858  -72.71572756   
V3     284807.00000000    -0.00000000     1.51625501  -48.32558936   
V4     284807.00000000     0.00000000     1.41586857   -5.68317120   
V5     284807.00000000     0.00000000     1.38024673 -113.74330671   
V6     284807.00000000     0.00000000     1.33227109  -26.16050594   
V7     284807.00000000    -0.00000000     1.23709360  -43.55724157   
V8     284807.00000000     0.00000000     1.19435290  -73.21671846   
V9     284807.00000000    -0.00000000     1.09863209  -13.43406632   
V10    284807.00000000     0.00000000     1.08884977  -24.58826244   
V11    284807.00000000     0.00000000     1.02071303   -4.79747346   
V12    284807.00000000    -0.00000000     0.99920139  -18.68371463   
V13    284807.00000000     0.00000000     0.99527423   -5.79188121   
V14    284807.00000000     0.00000000     0.95859561  -19.21432549   
V15    284807.00000000     0.00000000     0.91531601   -4.49894468   
V16    284807.00000000     0.00000000     0.87625289  -14.12985452   
V17    284807.00000000    -0.00000000     0.84933706  -25.16279937   
V18    284807.00000000     0.00000000     0.83817621   -9.49874592   
V19    284807.00000000     0.00000000     0.81404050   -7.21352743   
V20    284807.00000000     0.00000000     0.77092502  -54.49772049   
V21    284807.00000000     0.00000000     0.73452401  -34.83038214   
V22    284807.00000000    -0.00000000     0.72570156  -10.93314370   
V23    284807.00000000     0.00000000     0.62446030  -44.80773520   
V24    284807.00000000     0.00000000     0.60564707   -2.83662692   
V25    284807.00000000     0.00000000     0.52127807  -10.29539707   
V26    284807.00000000     0.00000000     0.48222701   -2.60455055   
V27    284807.00000000    -0.00000000     0.40363249  -22.56567932   
V28    284807.00000000    -0.00000000     0.33008326  -15.43008391   
Amount 284807.00000000    88.34961925   250.12010924    0.00000000   
Class  284807.00000000     0.00172749     0.04152719    0.00000000   

                  25%            50%             75%             max  
Time   54201.50000000 84692.00000000 139320.50000000 172792.00000000  
V1        -0.92037338     0.01810880      1.31564169      2.45492999  
V2        -0.59854991     0.06548556      0.80372387     22.05772899  
V3        -0.89036484     0.17984634      1.02719554      9.38255843  
V4        -0.84864012    -0.01984653      0.74334129     16.87534403  
V5        -0.69159707    -0.05433583      0.61192644     34.80166588  
V6        -0.76829561    -0.27418708      0.39856490     73.30162555  
V7        -0.55407588     0.04010308      0.57043607    120.58949395  
V8        -0.20862974     0.02235804      0.32734586     20.00720837  
V9        -0.64309757    -0.05142873      0.59713903     15.59499461  
V10       -0.53542573    -0.09291738      0.45392345     23.74513612  
V11       -0.76249420    -0.03275735      0.73959341     12.01891318  
V12       -0.40557149     0.14003259      0.61823803      7.84839208  
V13       -0.64853930    -0.01356806      0.66250496      7.12688296  
V14       -0.42557401     0.05060132      0.49314985     10.52676605  
V15       -0.58288428     0.04807155      0.64882081      8.87774160  
V16       -0.46803677     0.06641332      0.52329631     17.31511152  
V17       -0.48374831    -0.06567575      0.39967498      9.25352625  
V18       -0.49884980    -0.00363631      0.50080675      5.04106919  
V19       -0.45629892     0.00373482      0.45894936      5.59197143  
V20       -0.21172136    -0.06248109      0.13304084     39.42090425  
V21       -0.22839495    -0.02945017      0.18637720     27.20283916  
V22       -0.54235037     0.00678194      0.52855364     10.50309009  
V23       -0.16184635    -0.01119293      0.14764206     22.52841169  
V24       -0.35458614     0.04097606      0.43952660      4.58454914  
V25       -0.31714505     0.01659350      0.35071556      7.51958868  
V26       -0.32698393    -0.05213911      0.24095217      3.51734561  
V27       -0.07083953     0.00134215      0.09104512     31.61219811  
V28       -0.05295979     0.01124383      0.07827995     33.84780782  
Amount     5.60000000    22.00000000     77.16500000  25691.16000000  
Class      0.00000000     0.00000000      0.00000000      1.00000000  
In [ ]:
# Calcular la matriz de correlación
correlation_matrix = data.corr()

# Oculta los valores redundantes de la matriz de correlación, mostrando solo la
# mitad inferior.
#mask = np.triu(np.ones_like(correlation_matrix, dtype=bool))

# Visualizar la matriz de correlación
plt.figure(figsize=(12, 8))
sns.heatmap(correlation_matrix,
            #mask=mask,
            annot=False,
            cmap='coolwarm',
            linewidths=0.5,
            fmt='.2f')
plt.title("Matriz de Correlación para Datos Desbalanceados")
plt.show()
[Imagen: heatmap de la matriz de correlación (dataset desbalanceado completo)]

Matriz de Correlación - Clase Fraudulenta (Class = 1)

  • Analizamos si hay correlación entre las variables solamente para las clases fraudulentas.
In [ ]:
# ================================
# Filtrar solo los registros de fraude (Class = 1)
# ================================
fraudulent_data = data[data['Class'] == 1]

# ================================
# Calcular la matriz de correlación solo para registros de fraude
# ================================
correlation_matrix  = fraudulent_data.corr()

# ================================
# Crear un heatmap con valores legibles
# ================================
plt.figure(figsize=(14, 12))  # Ajustar tamaño del gráfico
sns.heatmap(
    correlation_matrix, # Matriz de correlación filtrada
    annot=True,         # Mostrar valores numéricos en las celdas
    fmt=".2f",          # Formato para mostrar solo 2 decimales
    #cmap='YlGnBu',      # Paleta de colores amarillo-verde-azul
    cmap='coolwarm',    # Paleta de colores azul-rojo
    cbar=True,          # Mostrar la barra de colores
    linewidths=0.5,     # Separación entre celdas
    annot_kws={"size": 8}  # Reducir tamaño del texto de anotaciones
)

# ================================
# Personalización del gráfico
# ================================
plt.title("Correlación de Variables (Solo Registros de Fraude: Class = 1)", fontsize=14)
#plt.xticks(rotation=45)  # Rotar etiquetas del eje X para mejor visibilidad
plt.xticks(rotation=90, ha='center')  # Rotar etiquetas del eje X a 90°
plt.yticks(rotation=0)                # Etiquetas del eje Y horizontales

plt.show()
[Imagen: heatmap de correlación solo para registros de fraude (Class = 1)]

Identificar pares de variables con alta correlación (umbral > 0.7 o < -0.7)

In [ ]:
# Filtrar pares con alta correlación (umbral > 0.7 o < -0.7)
correlated_pairs = correlation_matrix.abs().unstack().sort_values(ascending=False)
high_correlation_pairs = correlated_pairs[(correlated_pairs > 0.7) & (correlated_pairs < 1.0)]

print(high_correlation_pairs)
V17  V18   0.97149216
V18  V17   0.97149216
V17  V16   0.96015330
V16  V17   0.96015330
V18  V16   0.94449768
V16  V18   0.94449768
V1   V3    0.90787501
V3   V1    0.90787501
V1   V7    0.89760878
V7   V1    0.89760878
V1   V5    0.89496833
V5   V1    0.89496833
V12  V11   0.88971960
V11  V12   0.88971960
V3   V5    0.88368938
V5   V3    0.88368938
V3   V7    0.88231242
V7   V3    0.88231242
V12  V16   0.88170346
V16  V12   0.88170346
V3   V2    0.87690369
V2   V3    0.87690369
V9   V10   0.86396596
V10  V9    0.86396596
V7   V2    0.86298308
V2   V7    0.86298308
V11  V14   0.86204427
V14  V11   0.86204427
V10  V7    0.85982293
V7   V10   0.85982293
V21  V22   0.85211151
V22  V21   0.85211151
V12  V17   0.84279621
V17  V12   0.84279621
V7   V5    0.83973777
V5   V7    0.83973777
V10  V12   0.83433069
V12  V10   0.83433069
V2   V5    0.82839063
V5   V2    0.82839063
V2   V1    0.81922580
V1   V2    0.81922580
V4   V9    0.81885318
V9   V4    0.81885318
V10  V3    0.81824320
V3   V10   0.81824320
V16  V10   0.80663705
V10  V16   0.80663705
     V17   0.80310922
V17  V10   0.80310922
V12  V14   0.79982192
V14  V12   0.79982192
V18  V12   0.79375348
V12  V18   0.79375348
V18  V10   0.78650801
V10  V18   0.78650801
V4   V12   0.77876419
V12  V4    0.77876419
V5   V10   0.76458898
V10  V5    0.76458898
V9   V7    0.75472517
V7   V9    0.75472517
V11  V16   0.75449174
V16  V11   0.75449174
V18  V5    0.74559856
V5   V18   0.74559856
V8   V6    0.74302050
V6   V8    0.74302050
V2   V10   0.74108525
V10  V2    0.74108525
V3   V9    0.73320790
V9   V3    0.73320790
V7   V18   0.73140577
V18  V7    0.73140577
V10  V4    0.72697113
V4   V10   0.72697113
V5   V17   0.72379883
V17  V5    0.72379883
V3   V4    0.72376562
V4   V3    0.72376562
V11  V4    0.72168188
V4   V11   0.72168188
V10  V11   0.71750749
V11  V10   0.71750749
V17  V9    0.71388678
V9   V17   0.71388678
V12  V9    0.71058370
V9   V12   0.71058370
V10  V1    0.71011195
V1   V10   0.71011195
V7   V17   0.70308740
V17  V7    0.70308740
     V11   0.70265071
V11  V17   0.70265071
dtype: float64
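Nota: la matriz de correlación es simétrica, por lo que la lista anterior muestra cada par dos veces. Un boceto (con una matriz hipotética de 3 variables, no la del dataset) para conservar solo el triángulo superior y listar cada par una única vez:

```python
import numpy as np
import pandas as pd

# Matriz de correlación de ejemplo (hipotética, 3 variables)
cm = pd.DataFrame(
    [[1.0, 0.9, 0.2],
     [0.9, 1.0, 0.75],
     [0.2, 0.75, 1.0]],
    index=['V1', 'V2', 'V3'], columns=['V1', 'V2', 'V3']
)

# Máscara del triángulo superior (sin diagonal): cada par aparece una sola vez
upper = cm.where(np.triu(np.ones(cm.shape, dtype=bool), k=1))
pairs = upper.abs().unstack().dropna().sort_values(ascending=False)
print(pairs[pairs > 0.7])
```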

👁️ Observación:

Filtrar pares con alta correlación (umbral > 0.7 o < -0.7):

El umbral de correlación > 0.7 o < -0.7 es un valor comúnmente utilizado en análisis estadístico y machine learning porque:

Correlación alta

  • Un valor mayor que 0.7 o menor que -0.7 indica una relación lineal fuerte entre las variables.
  • Esto puede implicar redundancia o multicolinealidad en un modelo.
  • Las relaciones lineales fuertes pueden afectar negativamente el rendimiento de ciertos modelos, como:
    • Regresión lineal
    • Modelos basados en distancia (ej., KNN).

VIF (Variance Inflation Factor)

VIF mide la multicolinealidad entre las características. Si dos variables están altamente correlacionadas, una de ellas puede ser eliminada.

  • VIF > 10: Alta multicolinealidad → Eliminar o combinar la variable.
  • VIF entre 5-10: Requiere análisis adicional.
  • VIF < 5: Multicolinealidad aceptable.
In [ ]:
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Seleccionar las columnas no transformadas
X_vif = fraudulent_data[['Time', 'Amount']]

# Calcular VIF para cada característica
vif = pd.DataFrame()
vif['Feature'] = X_vif.columns
vif['VIF'] = [variance_inflation_factor(X_vif.values, i) for i in range(X_vif.shape[1])]

print("VIF por característica:")
print(vif)
VIF por característica:
  Feature        VIF
0    Time 1.18228779
1  Amount 1.18228779

Importancia de características usando correlación sobre el dataset "data"

Verificamos si hay correlación absoluta con la variable objetivo antes de realizar la limpieza de los datos.

In [ ]:
%%time
# Correlación: cuando tienes datos numéricos y buscas una relación lineal.
# Nota: %%time debe ser la primera línea de la celda para funcionar.

# Calcular la correlación absoluta con la variable objetivo
correlations = data.drop(columns='Class').corrwith(data['Class']).abs()

# Ordenar características por correlación
correlations = correlations.sort_values(ascending=False)

# Graficar la importancia de las características
plt.figure(figsize=(10, 6))
correlations.plot(kind='bar', color='skyblue')
plt.title('Importancia de características basada en correlación con la clase')
plt.xlabel('Características')
plt.ylabel('Correlación absoluta')
plt.xticks(rotation=90)
plt.tight_layout()
plt.show()
[Imagen: gráfico de barras de la correlación absoluta de cada característica con la clase]
CPU times: user 952 ms, sys: 71.3 ms, total: 1.02 s
Wall time: 2 s

📊 Visualización Inicial de Datos¶

En esta sección hacemos una exploración visual de los datos en busca de patrones y problemas que deben ser corregidos antes de trabajar con los modelos.

Preguntas de Negocio¶

Histogramas y gráficas de correlación relevantes para entender la distribución de los datos.

🤔 Pregunta 1: ¿Cuántas transacciones fraudulentas hay en comparación con las no fraudulentas? (Utiliza un gráfico de barras)

In [ ]:
print('Cantidad Transacciones no-fraudulentas (0):',
                                          non_fraudulent_data.shape[0])

print('Cantidad Transacciones fraudulentas (1):', fraudulent_data.shape[0])
Cantidad Transacciones no-fraudulentas (0): 284315
Cantidad Transacciones fraudulentas (1): 492
In [ ]:
# Muestra la distribución de las transacciones fraudulentas con respecto de las
# no fraudulentas

#colors = ["blue", "red"]
colors = ["#87CEEB", "#FF6347"]

# Configurar la visualización en una fila
fig, axes = plt.subplots(1, figsize=(10, 6))

# Función para agregar etiquetas dentro de las barras
def add_labels(ax):
    for p in ax.patches:
        ax.annotate(f'{int(p.get_height())}', (p.get_x() + p.get_width() / 2,
                                               p.get_height()),
                    ha='center', va='bottom', fontsize=12, color='black',
                    weight='bold')

# Cuenta las veces que ocurre cada clase (0: no fraude, 1: fraude)
ax1 = sns.countplot(x='Class', data=data, palette=colors)
ax1.set_title('Transacciones fraudulentas vs no fraudulentas', fontsize=14,
              fontweight='bold')

ax1.set_xlabel("Tipo de Transacción")
ax1.set_ylabel("Cantidad de transacciones")
add_labels(ax1)

plt.xticks([0, 1], ['No-Fraude', 'Fraude'])
plt.show()
[Imagen: gráfico de barras de transacciones fraudulentas vs no fraudulentas]

🤔 Pregunta 2: ¿Cuál es la distribución de los importes y tiempos de las transacciones Fraudulentas y No-Fraudulentas? ¿Siguen una distribución normal?

In [ ]:
# 1. Histogramas y KDE Plots

# Configurar el tamaño de la figura con subplots en una cuadrícula de 2x2
fig, axs = plt.subplots(2, 2, figsize=(16, 8))

# Histograma y curva de densidad para Amount en transacciones fraudulentas
sns.histplot(fraudulent_data['Amount'], bins=30, edgecolor='black',
             color='salmon', kde=True, ax=axs[0, 0])
axs[0, 0].set_xlabel('Importe de la transacción')
axs[0, 0].set_ylabel('Cantidad de Transacciones')
axs[0, 0].set_title('Distribución de los importes de las transacciones \
fraudulentas', fontsize=14, fontweight='bold')

# Histograma y curva de densidad para Amount en transacciones no fraudulentas
sns.histplot(non_fraudulent_data['Amount'], bins=30, edgecolor='black',
             color='skyblue', kde=True, ax=axs[0, 1])
axs[0, 1].set_xlabel('Importe de la transacción')
axs[0, 1].set_ylabel('Cantidad de Transacciones')
axs[0, 1].set_title('Distribución de los importes de las transacciones \
no-fraudulentas', fontsize=14, fontweight='bold')

# Evitar notación científica en el eje y
axs[0, 1].ticklabel_format(style='plain', axis='y')

# Histograma y curva de densidad para Time en transacciones fraudulentas
sns.histplot(fraudulent_data['Time'], bins=30, edgecolor='black',
             color='lightcoral', kde=True, ax=axs[1, 0])
axs[1, 0].set_xlabel('Tiempo')
axs[1, 0].set_ylabel('Cantidad de Transacciones')
axs[1, 0].set_title('Distribución de los tiempos en las transacciones \
fraudulentas', fontsize=14, fontweight='bold')

# Histograma y curva de densidad para Time en transacciones no fraudulentas
sns.histplot(non_fraudulent_data['Time'], bins=30, edgecolor='black',
             color='lightblue', kde=True, ax=axs[1, 1])
axs[1, 1].set_xlabel('Tiempo')
axs[1, 1].set_ylabel('Cantidad de Transacciones')
axs[1, 1].set_title('Distribución de los tiempos en las transacciones \
no-fraudulentas', fontsize=14, fontweight='bold')

# Ajustar la presentación para que las gráficas no se solapen
plt.tight_layout()
plt.show()
[Imagen: histogramas y curvas KDE de Amount y Time por clase, en cuadrícula 2x2]

👁️ Observación:
Los histogramas presentan la distribución de las variables Amount y Time tanto para transacciones fraudulentas como para no fraudulentas.

  • Amount:

    • Las transacciones fraudulentas y no-fraudulentas suelen tener montos bajos, aunque la cantidad de transacciones no-fraudulentas es significativamente mayor. Esta diferencia en la frecuencia puede ser un factor clave al momento de construir un modelo de detección de fraudes.
  • Time:

    • Los fraudes no parecen concentrarse en horarios específicos, mientras que las transacciones no-fraudulentas muestran patrones temporales más definidos.

    Este análisis sugiere que para detectar fraudes, los montos bajos deben ser considerados con atención. Además, la variable Time podría tener un menor impacto en la detección de fraudes, ya que no presenta un patrón claro para distinguir entre transacciones fraudulentas y no fraudulentas.


In [ ]:
# 2. Gráfico Q-Q (Quantile-Quantile Plot)
import scipy.stats as stats

# Configurar la figura con subplots en una cuadrícula de 2x2
fig, axs = plt.subplots(2, 2, figsize=(16, 8))

# Q-Q Plot para Amount en transacciones fraudulentas
stats.probplot(fraudulent_data['Amount'], dist="norm", plot=axs[0, 0])
axs[0, 0].set_title("Q-Q Plot for Amount (Fraudulent)", fontsize=14,
                    fontweight='bold')

# Q-Q Plot para Amount en transacciones no fraudulentas
stats.probplot(non_fraudulent_data['Amount'], dist="norm", plot=axs[0, 1])
axs[0, 1].set_title("Q-Q Plot for Amount (Non-Fraudulent)", fontsize=14,
                    fontweight='bold')

# Q-Q Plot para Time en transacciones fraudulentas
stats.probplot(fraudulent_data['Time'], dist="norm", plot=axs[1, 0])
axs[1, 0].set_title("Q-Q Plot for Time (Fraudulent)", fontsize=14,
                    fontweight='bold')

# Q-Q Plot para Time en transacciones no fraudulentas
stats.probplot(non_fraudulent_data['Time'], dist="norm", plot=axs[1, 1])
axs[1, 1].set_title("Q-Q Plot for Time (Non-Fraudulent)", fontsize=14,
                    fontweight='bold')

# Ajuste para que las gráficas no se solapen
plt.tight_layout()
plt.show()
[Imagen: Q-Q plots de Amount y Time por clase, en cuadrícula 2x2]

👁️ Observación:
Estas gráficas sugieren que ni Amount ni Time siguen una distribución normal, lo cual es importante al seleccionar métodos de análisis y modelos de machine learning que asuman normalidad.

  • Amount:

    • Amount no sigue una distribución normal en ninguno de los dos casos (fraudulento y no-fraudulento). Esto es esperado en transacciones monetarias, ya que las cantidades suelen estar sesgadas hacia valores bajos, con algunas transacciones en valores altos que representan eventos raros.
  • Time:

    • Time muestra un ajuste más cercano a una distribución normal en el centro, pero se desvía en las colas. Esto puede ser suficiente para ciertos modelos que no requieren una distribución estrictamente normal, pero debes considerar esta información si planeas aplicar técnicas estadísticas que dependen de la normalidad.
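Dado ese sesgo a la derecha, una transformación logarítmica suele acercar la distribución de los montos a la normalidad (esta idea reaparece en la sección de corrección de sesgos). Un boceto mínimo con montos hipotéticos; `np.log1p` admite valores iguales a cero:

```python
import numpy as np
import pandas as pd

# Montos de ejemplo (hipotéticos), fuertemente sesgados a la derecha
amounts = pd.Series([0.0, 1.0, 9.25, 99.99, 500.0, 2125.87])

# log1p = log(1 + x): definido incluso para montos iguales a cero
log_amounts = np.log1p(amounts)

print("Asimetría antes: ", round(amounts.skew(), 2))
print("Asimetría después:", round(log_amounts.skew(), 2))
```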

In [ ]:
# 3. Pruebas Estadísticas (Shapiro-Wilk o Kolmogorov-Smirnov)
# Nota: para N > 5000, scipy advierte que el p-value de Shapiro-Wilk
# puede no ser preciso.

# Nivel de significancia
alpha = 0.05

# Prueba de Shapiro-Wilk para Amount en transacciones fraudulentas
stat, p_value = shapiro(fraudulent_data['Amount'])
print("Amount (Fraudulent) - Estadístico:", stat, ", p-value:", p_value)
if p_value > alpha:
    print("Amount (Fraudulent) sigue una distribución normal.")
else:
    print("Amount (Fraudulent) no sigue una distribución normal.")

# Prueba de Shapiro-Wilk para Amount en transacciones no fraudulentas
stat, p_value = shapiro(non_fraudulent_data['Amount'])
print("Amount (Non-Fraudulent) - Estadístico:", stat, ", p-value:", p_value)
if p_value > alpha:
    print("Amount (Non-Fraudulent) sigue una distribución normal.")
else:
    print('Amount (Non-Fraudulent) no sigue una distribución normal. \n')

# Prueba de Shapiro-Wilk para Time en transacciones fraudulentas
stat, p_value = shapiro(fraudulent_data['Time'])
print("Time (Fraudulent) - Estadístico:", stat, ", p-value:", p_value)
if p_value > alpha:
    print("Time (Fraudulent) sigue una distribución normal.")
else:
    print("Time (Fraudulent) no sigue una distribución normal.")

# Prueba de Shapiro-Wilk para Time en transacciones no fraudulentas
stat, p_value = shapiro(non_fraudulent_data['Time'])
print("Time (Non-Fraudulent) - Estadístico:", stat, ", p-value:", p_value)
if p_value > alpha:
    print("Time (Non-Fraudulent) sigue una distribución normal.")
else:
    print("Time (Non-Fraudulent) no sigue una distribución normal.")
Amount (Fraudulent) - Estadístico: 0.5253348984091436 , p-value: 2.8247332909327927e-34
Amount (Fraudulent) no sigue una distribución normal.
Amount (Non-Fraudulent) - Estadístico: 0.30965871086088315 , p-value: 6.851464669833563e-199
Amount (Non-Fraudulent) no sigue una distribución normal. 

Time (Fraudulent) - Estadístico: 0.9426319474778967 , p-value: 7.454733029562764e-13
Time (Fraudulent) no sigue una distribución normal.
Time (Non-Fraudulent) - Estadístico: 0.9400034429247126 , p-value: 7.379668277117668e-119
Time (Non-Fraudulent) no sigue una distribución normal.
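Dado que Shapiro-Wilk pierde precisión con muestras grandes, Kolmogorov-Smirnov es la alternativa mencionada en el comentario de la celda. Un boceto con datos hipotéticos (al estimar media y desviación de la propia muestra, el p-value es solo aproximado; la corrección de Lilliefors sería lo formal):

```python
import numpy as np
from scipy import stats

# Muestra grande de ejemplo (hipotética), claramente no normal
rng = np.random.default_rng(42)
sample = rng.exponential(scale=100.0, size=10_000)

# Estandarizar y comparar contra la normal estándar
z = (sample - sample.mean()) / sample.std()
stat, p_value = stats.kstest(z, 'norm')

print(f"Estadístico KS: {stat:.4f}, p-value: {p_value:.2e}")
```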

🤔 Pregunta 3: "¿Existen datos atípicos en el dataset? ¿Cómo están distribuidos estos valores?"

In [ ]:
%%time
# Visualizar distribuciones de Amount y las variables PCA
# Nota: %%time debe ser la primera línea de la celda para funcionar.

# Configurar la cantidad de columnas en los subplots
columns = data.columns
n_cols = 4  # Número de gráficos por fila
n_rows = -(-len(columns) // n_cols)  # Calcular el número de filas necesarias

# Crear la figura y los subplots
fig, axes = plt.subplots(n_rows, n_cols, figsize=(20, n_rows * 4))
axes = axes.flatten()  # Aplanar los ejes para iterar fácilmente

# Crear un boxplot para cada columna
for i, col in enumerate(columns):
    sns.boxplot(data=data[col], ax=axes[i], color="skyblue")
    axes[i].set_title(f"Boxplot de {col}", fontsize=12)
    axes[i].set_xlabel("Valores")
    axes[i].set_ylabel("")

# Ocultar subplots vacíos si hay más subplots que columnas
for j in range(i + 1, len(axes)):
    axes[j].axis('off')

# Ajustar el espacio entre subplots
plt.tight_layout()
plt.show()
[Imagen: boxplots de todas las columnas del dataset, en cuadrícula de 4 por fila]
CPU times: user 25.9 s, sys: 546 ms, total: 26.4 s
Wall time: 29.4 s

👁️ Observación:

  1. Variables (V1 a V28):

    • La mayoría tienen valores atípicos (outliers), representados por puntos fuera del rango de los bigotes.
    • Las distribuciones tienden a centrarse alrededor de cero, lo cual es común en variables transformadas, como ocurre en datos procesados con PCA.
  2. Time:

    • Presenta una distribución más amplia y uniforme sin tantos valores atípicos, indicando que es una variable temporal o secuencial.
  3. Amount:

    • Muestra valores atípicos significativos. Las transacciones con montos elevados son claras candidatas a ser anomalías (fraudes), lo cual concuerda con el análisis de fraude.
  4. Class:

    • Aquí se confirma la distribución desbalanceada entre clases (0: no fraude, 1: fraude).
      La cantidad de transacciones fraudulentas es mínima comparada con las no fraudulentas, lo que afecta al modelo y requiere técnicas para equilibrar las clases.
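Los bigotes de un boxplot siguen la regla del IQR; un boceto mínimo (con una serie hipotética, no una columna del dataset) para contar los atípicos de una variable con esa misma regla:

```python
import pandas as pd

# Serie de ejemplo (hipotética) con dos valores atípicos evidentes
values = pd.Series([10, 12, 11, 13, 12, 11, 10, 14, 13, 12, 95, 120])

# Regla del IQR: atípico si cae fuera de [Q1 - 1.5*IQR, Q3 + 1.5*IQR]
q1, q3 = values.quantile(0.25), values.quantile(0.75)
iqr = q3 - q1
outliers = values[(values < q1 - 1.5 * iqr) | (values > q3 + 1.5 * iqr)]

print(f"Atípicos detectados: {len(outliers)}")
```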

🤔 Pregunta 4: ¿En qué rangos de importes (Amount) se concentran proporcionalmente más las transacciones fraudulentas en comparación con las no-fraudulentas, y qué implicaciones podría tener esto para mejorar los sistemas de detección de fraude?

  • Analizar si ciertos rangos de importes tienen un comportamiento inusual en términos de fraude.

  • Mejorar los sistemas de monitoreo: Ajustar los algoritmos de detección para prestar más atención a esos rangos específicos.

In [ ]:
%%time
# Crear rangos de valores para Amount
bins = [0, 10, 50, 100, 500, 1000, 5000]  # Define los rangos

# Etiquetas para los rangos
labels = ['0-10', '10-50', '50-100', '100-500', '500-1000', '1000-5000']

data['Amount_range'] = pd.cut(data['Amount'], bins=bins, labels=labels,
                              include_lowest=True)

# Contar las transacciones por rango y clase
range_class_counts = data.groupby(['Amount_range',
                                   'Class']).size().unstack(fill_value=0)

# Calcular proporciones relativas por clase
total_counts = data['Class'].value_counts()
range_proportions = range_class_counts.div(total_counts, axis=1)

# Preparar los datos para el gráfico
range_proportions.reset_index(inplace=True)
range_proportions = range_proportions.melt(id_vars='Amount_range',
                                           var_name='Clase',
                                           value_name='Proporción')

# Cambiar las etiquetas de clase a 'No Fraude' y 'Fraude'
range_proportions['Clase'] = range_proportions['Clase'].replace({0: 'No Fraude',
                                                                 1: 'Fraude'})

# Crear el gráfico de barras dobles
plt.figure(figsize=(10, 6))
ax = sns.barplot(
    x='Amount_range',
    y='Proporción',
    hue='Clase',
    data=range_proportions,
    #palette=['blue', 'red']
    palette=['#87CEEB', '#FF6347']
)

# Añadir etiquetas de valores sobre las barras con formato de porcentaje
for container in ax.containers:
    ax.bar_label(
        container,
        fmt=lambda x: f"{x * 100:.0f}%",  # Convertir a porcentaje y redondear
        label_type='edge',
        color='black',
        fontsize=10,
        fontweight='bold'
    )

# Configurar títulos y etiquetas
plt.title('Proporciones de rangos de importes entre \n transacciones \
Fraudulentas y No-fraudulentas', fontsize=16, fontweight='bold')

plt.xlabel('Rangos de Importe (Amount)', fontsize=14)
plt.ylabel('Proporción', fontsize=14)
plt.xticks(fontsize=12, rotation=45)  # Rotar etiquetas del eje X

# Configurar el eje Y con formato de porcentaje
yticks = plt.gca().get_yticks()  # Obtener las posiciones de los ticks actuales
plt.gca().set_yticks(yticks)  # Establecer las posiciones
plt.gca().set_yticklabels([f"{int(tick * 100)}%" for tick in yticks],
                          fontsize=12)  # Formatear como porcentaje

# Ajustar la leyenda
plt.legend(title='Tipo de Transacción', title_fontsize=14, fontsize=12)

# Mostrar la gráfica
plt.tight_layout()
plt.show()
[Imagen: gráfico de barras dobles con las proporciones por rango de importe y clase]
CPU times: user 584 ms, sys: 117 ms, total: 700 ms
Wall time: 715 ms

🤔 Pregunta 5: ¿Los importes (Amount) igual a cero se concentran proporcionalmente más en las transacciones fraudulentas o en las no-fraudulentas?

In [ ]:
# Calculate proportions for transactions with Amount = 0
#proportion_amount_zero = data[data['Amount'] == 0]['Class'].value_counts(normalize=True) * 100

# Filtrar solo las transacciones donde Amount es igual a 0
amount_zero_data = data[data['Amount'] == 0]

# Contar cuántas transacciones con Amount = 0 hay en cada clase
count_amount_zero = amount_zero_data['Class'].value_counts()

# Obtener el total de transacciones en cada clase
total_count_per_class = data['Class'].value_counts()

# Calcular la proporción de Amount = 0 en cada clase
proportion_amount_zero = (count_amount_zero / total_count_per_class).fillna(0)

# Reemplazar etiquetas para mayor claridad
labels = {0: "(0) No-Fraude", 1: "(1) Fraude"}
proportion_amount_zero.index = [labels[i] for i in proportion_amount_zero.index]


# Configuración de estilo con Seaborn
sns.set(style="whitegrid")

# Datos de proporción
labels = ['No-Fraude (0)', 'Fraude (1)']
values = proportion_amount_zero.values
colors = ['#87CEEB', '#FF6347']  # Colores personalizados

# Crear gráfico de tarta (pizza)
fig, ax = plt.subplots(figsize=(8, 6))
wedges, texts, autotexts = ax.pie(
    values,
    labels=labels,
    autopct='%1.1f%%',  # Mostrar porcentajes
    startangle=140,
    colors=colors,
    wedgeprops=dict(edgecolor='white'),
    explode=(0.05, 0.05)  # Separar ligeramente las porciones
)

# Personalizar texto
for text in texts + autotexts:
    text.set_fontsize(12)
    text.set_color('black')
    text.set_fontweight('bold')

# Título del gráfico
plt.title("Proporción de Transacciones con Amount = 0 por Clase", fontsize=14, fontweight='bold')

# Agregar interacción con mplcursors (hover)
cursor = mplcursors.cursor(wedges, hover=True)
@cursor.connect("add")
def on_hover(sel):
    sel.annotation.set_text(f"{labels[sel.index]}\n{values[sel.index]:.2f}%")

# Mostrar el gráfico
plt.tight_layout()
plt.show()

👁️ Observación:

  • Mantener las transacciones con Amount = 0:
    • Aunque haya menos transacciones fraudulentas con Amount = 0 en términos absolutos, la proporción más alta dentro de las transacciones fraudulentas hace que sea relevante mantener estos datos en el análisis.

    • Podrían representar un patrón importante, como intentos de fraude con transacciones pequeñas o pruebas de tarjeta.


🔄 Preprocesamiento¶

En esta sección se realizan varias operaciones de preprocesamiento y preparación de los datos para aplicar modelos de aprendizaje automático.

🧹 Limpieza de datos¶

En esta sección se realiza el manejo de valores nulos, la eliminación de duplicados, el escalado de datos numéricos, etc.

Distribución de la variable "Class" antes del proceso de limpieza¶

In [ ]:
# Distribución de transacciones no-fraudulentas (0) y
# transacciones fraudulentas (1)
data['Class'].value_counts()
Out[ ]:
Class
0    284315
1       492
Name: count, dtype: int64

Valores perdidos¶

In [ ]:
# Verifica la cantidad de valores nulos en cada columna
data.isnull().sum()
Out[ ]:
Time 0
V1 0
V2 0
V3 0
V4 0
V5 0
V6 0
V7 0
V8 0
V9 0
V10 0
V11 0
V12 0
V13 0
V14 0
V15 0
V16 0
V17 0
V18 0
V19 0
V20 0
V21 0
V22 0
V23 0
V24 0
V25 0
V26 0
V27 0
V28 0
Amount 0
Class 0
Amount_range 55

In [ ]:
# Elimina los valores nulos (si los hubiera)
data.dropna(inplace=True)

Datos duplicados¶

In [ ]:
# Verifica la cantidad de filas duplicadas en el dataframe
duplicados = data.duplicated()
num_duplicados = duplicados.sum()

print(f"Número de filas duplicadas en el dataset: {num_duplicados}")
Número de filas duplicadas en el dataset: 1081
In [ ]:
# Verifica la cantidad de transacciones no-fraudulentas duplicadas en el dataframe
duplicados_nofraude = data[(data.duplicated()) & (data['Class'] == 0)].shape[0]
print(f"Transacciones Legítimas Duplicadas: {duplicados_nofraude}")
Transacciones Legítimas Duplicadas: 1062
In [ ]:
# Verifica la cantidad de transacciones fraudulentas duplicadas en el dataframe
duplicados_fraude = data[(data.duplicated()) & (data['Class'] == 1)].shape[0]
print(f"Transacciones Fraudulentas Duplicadas: {duplicados_fraude}")
Transacciones Fraudulentas Duplicadas: 19
In [ ]:
# Listar las filas duplicadas considerando todas las columnas
duplicados_completos = data[data.duplicated(keep=False)]

# Mostrar el número de duplicados completos
print(f"Número de transacciones idénticas en todas las columnas: "
      f"{duplicados_completos.shape[0]}")
print("Ejemplos de transacciones duplicadas idénticas:")

# Presenta las 5 primeras observaciones duplicadas
print(duplicados_completos.head())
Número de transacciones idénticas en todas las columnas: 1854
Ejemplos de transacciones duplicadas idénticas:
           Time          V1         V2         V3         V4         V5  \
32  26.00000000 -0.52991228 0.87389158 1.34724733 0.14545668 0.41420886   
33  26.00000000 -0.52991228 0.87389158 1.34724733 0.14545668 0.41420886   
34  26.00000000 -0.53538776 0.86526781 1.35107629 0.14757547 0.43368021   
35  26.00000000 -0.53538776 0.86526781 1.35107629 0.14757547 0.43368021   
112 74.00000000  1.03837033 0.12748613 0.18445589 1.10994979 0.44169890   

            V6          V7         V8          V9         V10        V11  \
32  0.10022309  0.71120608 0.17606596 -0.28671693 -0.48468768 0.87248959   
33  0.10022309  0.71120608 0.17606596 -0.28671693 -0.48468768 0.87248959   
34  0.08698294  0.69303931 0.17974226 -0.28564186 -0.48247447 0.87179958   
35  0.08698294  0.69303931 0.17974226 -0.28564186 -0.48247447 0.87179958   
112 0.94528253 -0.03671460 0.35099500  0.11894954 -0.24328924 0.57806260   

           V12         V13        V14         V15         V16         V17  \
32  0.85163586 -0.57174530 0.10097427 -1.51977183 -0.28437598 -0.31052358   
33  0.85163586 -0.57174530 0.10097427 -1.51977183 -0.28437598 -0.31052358   
34  0.85344743 -0.57182189 0.10225210 -1.51999120 -0.28591250 -0.30963339   
35  0.85344743 -0.57182189 0.10225210 -1.51999120 -0.28591250 -0.30963339   
112 0.67472982 -0.53423057 0.44660138  1.12288467 -1.76800051  1.24115696   

            V18         V19         V20        V21        V22         V23  \
32  -0.40424787 -0.82337352 -0.29034761 0.04694907 0.20810486 -0.18554835   
33  -0.40424787 -0.82337352 -0.29034761 0.04694907 0.20810486 -0.18554835   
34  -0.40390199 -0.82374299 -0.28326378 0.04952569 0.20653654 -0.18710807   
35  -0.40390199 -0.82374299 -0.28326378 0.04952569 0.20653654 -0.18710807   
112 -2.44949986 -1.74725517 -0.33551985 0.10251980 0.60508853  0.02309216   

            V24        V25         V26         V27        V28     Amount  \
32   0.00103066 0.09881570 -0.55290360 -0.07328808 0.02330705 6.14000000   
33   0.00103066 0.09881570 -0.55290360 -0.07328808 0.02330705 6.14000000   
34   0.00075301 0.09811661 -0.55347097 -0.07830550 0.02542738 1.77000000   
35   0.00075301 0.09811661 -0.55347097 -0.07830550 0.02542738 1.77000000   
112 -0.62646266 0.47912027 -0.16693684  0.08124672 0.00119158 1.18000000   

     Class Amount_range  
32       0         0-10  
33       0         0-10  
34       0         0-10  
35       0         0-10  
112      0         0-10  
In [ ]:
# Eliminar filas duplicadas
data = data.drop_duplicates()
In [ ]:
# Distribución de transacciones no-fraudulentas y transacciones fraudulentas
# después de eliminar duplicados
data['Class'].value_counts()
Out[ ]:
Class
0    283198
1       473
Name: count, dtype: int64

In [ ]:
# Número total de filas y columnas después de la eliminación de los datos
# duplicados
data.shape
Out[ ]:
(283671, 32)

Corrección de Sesgos (Skewness) en Características Numéricas¶

En esta sección se corrige el sesgo (skewness) de las variables numéricas en todo el dataset "data".

Este proceso identifica columnas numéricas con alta asimetría en sus distribuciones y aplica transformaciones (logaritmo natural o raíz cuadrada) para reducir la skewness. Se comparan visualmente las distribuciones antes y después de la transformación, mostrando una mejora significativa en la simetría de los datos.

Esta corrección mejora el rendimiento y la estabilidad de los modelos de Machine Learning al preparar los datos en un formato más adecuado.

In [ ]:
%%time

from scipy.stats import skew

# ============================================
# Identificar características numéricas
# ============================================
numerical_features = data.select_dtypes(include=['float64', 'int64']).columns

# Excluir la columna 'Class' de las características
numerical_features = numerical_features.drop('Class', errors='ignore')

# ============================================
# Calcular skewness para todo el dataset
# ============================================
skew_values = data[numerical_features].apply(lambda x: skew(x.dropna()))
skew_df = pd.DataFrame({'Column': numerical_features, 'Skewness': skew_values})
skew_df = skew_df.sort_values(by='Skewness', ascending=False)

# Umbral para skewness significativa
threshold = 1
columns_with_high_skew = skew_df[abs(skew_df['Skewness']) > threshold]['Column'].tolist()

print(f"⚠️ Columnas con skewness alta (>|{threshold}|):")
print(columns_with_high_skew)

# Guardar valores originales solo para comparar en las gráficas
original_data = data[columns_with_high_skew].copy()

# ============================================
# Aplicar transformación de skewness en todo el dataset
# ============================================
for col in columns_with_high_skew:
    if (data[col] > 0).all():  # Si los valores son positivos
        data[col] = np.log1p(data[col])
    else:  # Si hay valores negativos o mixtos
        data[col] = np.sqrt(data[col] - data[col].min() + 1)

# ============================================
# Visualización: Antes y Después de la Transformación
# ============================================
fig, axes = plt.subplots(len(columns_with_high_skew), 2, figsize=(12, 5 * len(columns_with_high_skew)))
fig.suptitle("\nDistribución Antes y Después de la Transformación", fontsize=16)

for i, col in enumerate(columns_with_high_skew):
    # Distribución original
    axes[i, 0].hist(original_data[col].dropna(), bins=50, color='blue', alpha=0.7)
    axes[i, 0].set_title(f'Original: {col}')

    # Distribución transformada
    axes[i, 1].hist(data[col].dropna(), bins=50, color='green', alpha=0.7)
    axes[i, 1].set_title(f'Transformed: {col}')

plt.tight_layout()
plt.subplots_adjust(top=0.95)  # Ajustar el título principal
plt.show()

# ============================================
# Comparación de Skewness Antes y Después
# ============================================
skew_values_after = data[columns_with_high_skew].apply(lambda x: skew(x.dropna()))
comparison_df = pd.DataFrame({
    'Column': columns_with_high_skew,
    'Skewness_Before': skew_df.loc[columns_with_high_skew, 'Skewness'].values,
    'Skewness_After': skew_values_after.values
}).sort_values(by='Skewness_After', ascending=False)

print("\n✅ Comparación de Skewness Antes y Después de la Transformación:")
print(comparison_df)

# ============================================
# Validación Final: Distribución de la columna 'Class'
# ============================================
print("\n✅ Distribución de la columna 'Class' después de la transformación:")
print(data['Class'].value_counts())

# Validación Final: Estadísticas de la columna 'Amount'
print("\n✅ Estadísticas de la columna 'Amount' después de la transformación:")
print(data['Amount'].describe())
⚠️ Columnas con skewness alta (>|1|):
['V28', 'Amount', 'V21', 'V10', 'V6', 'V16', 'V7', 'V14', 'V3', 'V12', 'V20', 'V27', 'V1', 'V17', 'V2', 'V23', 'V8']
✅ Comparación de Skewness Antes y Después de la Transformación:
    Column  Skewness_Before  Skewness_After
1   Amount       7.96253473      2.61906002
0      V28      12.23872886      2.41836887
4       V6       1.10472746      0.52916589
3      V10       1.28136542     -0.19545718
2      V21       2.92668084     -1.24625006
5      V16      -1.09115200     -1.94859128
7      V14      -1.92183089     -3.35653552
8       V3      -1.98781931     -3.43521368
9      V12      -2.19822660     -3.56130787
12      V1      -3.17148678     -4.30654934
6       V7      -1.87565556     -4.43710734
11     V27      -3.00247374     -6.14314437
14      V2      -4.02038196     -6.45443785
10     V20      -2.82562203     -7.24348793
13     V17      -3.69088101     -7.74755919
15     V23      -4.79793620    -11.63566776
16      V8      -8.30897873    -13.15432032

✅ Distribución de la columna 'Class' después de la transformación:
Class
0    283198
1       473
Name: count, dtype: int64

✅ Estadísticas de la columna 'Amount' después de la transformación:
count   283671.00000000
mean         6.83089905
std          6.42913925
min          1.00000000
25%          2.56904652
50%          4.79583152
75%          8.85945822
max         70.43330178
Name: Amount, dtype: float64
CPU times: user 13.7 s, sys: 420 ms, total: 14.1 s
Wall time: 14.1 s

Detección de anomalías (método: Isolation Forest)¶

Para este dataset, aplicaremos Isolation Forest ya que es robusto en conjuntos de datos grandes y permite identificar anomalías en variables PCA y escaladas.

El uso del método de detección de anomalías puede beneficiar el proyecto al reducir la influencia de los valores atípicos, lo que potencialmente mejora el rendimiento y la capacidad de generalización del modelo en datos nuevos.

Aplicaremos esta técnica solamente para las transacciones No-Fraudulentas.

In [ ]:
# Aplicar Isolation Forest solo a transacciones no-fraudulentas

# Separar las transacciones no fraudulentas
# (con .copy() para evitar el SettingWithCopyWarning al añadir la columna 'Anomaly')
non_fraud_data = data[data['Class'] == 0].copy()

# Ajusta el parámetro 'contamination' según tus datos
iso = IsolationForest(contamination=0.01, random_state=42)
non_fraud_data['Anomaly'] = iso.fit_predict(non_fraud_data[['Amount']])

# Filtrar transacciones no anómalas
clean_non_fraud_data = non_fraud_data[non_fraud_data['Anomaly'] == 1].drop('Anomaly', axis=1)

# Separar las transacciones fraudulentas (sin cambios)
fraud_data = data[data['Class'] == 1]

# Combinar datos limpios
clean_data = pd.concat([clean_non_fraud_data, fraud_data], axis=0)

# Verificar el resultado
print(f"Transacciones originales: {data.shape[0]}")
print(f"Transacciones después de la limpieza: {clean_data.shape[0]}")
Transacciones originales: 283671
Transacciones después de la limpieza: 280898

Regenerar gráficas con datos limpios¶

Usamos clean_data para generar las gráficas de proporciones y validar que los rangos aún reflejan patrones significativos.

In [ ]:
# Gráfica de proporciones por rangos de Amount:

# Crear el gráfico de barras dobles
plt.figure(figsize=(10, 6))
ax = sns.barplot(
    x='Amount_range',
    y='Proporción',
    hue='Clase',
    data=range_proportions,
    #palette=['blue', 'red']
    palette=['#87CEEB', '#FF6347']
)

# Añadir etiquetas de valores encima de las barras
for container in ax.containers:
    # Asegurarnos de mostrar valores pequeños redondeados
    for bar, value in zip(container, container.datavalues):
        if value > 0:  # Evitar mostrar etiquetas para proporciones de 0
            ax.text(
                bar.get_x() + bar.get_width() / 2,  # Posición en X
                bar.get_height() + 0.002, # Posición ligeramente encima de la barra
                f"{value:.0%}",  # Formato de porcentaje sin decimales
                ha='center',
                va='bottom',
                fontsize=12,
                color='black',
                fontweight='bold'
            )

# Configurar títulos y etiquetas
plt.title('Proporciones de rangos de importes entre \n transacciones '
          'Fraudulentas y No-fraudulentas (Limpias)',
          fontsize=16, fontweight='bold')

plt.xlabel('Rangos de Importe (Amount)', fontsize=14)
plt.ylabel('Proporción (%)', fontsize=14)
plt.xticks(fontsize=12, rotation=45)

# Configurar el eje Y para que muestre valores redondeados
yticks = plt.gca().get_yticks()  # Obtener los valores actuales del eje Y
plt.gca().set_yticks(yticks)  # Asegurar que los valores permanezcan
plt.gca().set_yticklabels([f"{tick:.0%}" for tick in yticks], fontsize=12) # Formatear como porcentajes

# Ajustar la leyenda
plt.legend(title='Tipo de Transacción', title_fontsize=12, fontsize=12)

# Mostrar la gráfica
plt.tight_layout()
plt.show()

Eliminación de columnas irrelevantes¶

In [ ]:
# Eliminar las columnas irrelevantes Time y Amount_range
clean_data = clean_data.drop(['Time', 'Amount_range'], axis=1)
In [ ]:
# Número de filas y columnas después de la eliminación de las columnas Time y Amount_range
clean_data.shape
Out[ ]:
(280898, 30)
In [ ]:
# Verificar que las columnas Time y Amount_range han sido eliminadas del dataset
print(clean_data.head())
          V1         V2         V3          V4          V5         V6  \
0 7.48650135 8.58154685 6.10060085  1.37815522 -0.33832077 5.25574863   
1 7.65502232 8.60127190 5.90317407  0.44815408  0.06001765 5.20366651   
2 7.48659840 8.50738294 6.03773081  0.37977959 -0.50319813 5.38154302   
3 7.51273838 8.57499280 6.03936895 -0.86329128 -0.01030888 5.32988828   
4 7.49995177 8.63675080 6.01911138  0.40303393 -0.40719338 5.22076885   

          V7         V8          V9        V10         V11        V12  \
0 6.69304416 8.62063898  0.36378697 5.06745070 -0.55159953 4.36645323   
1 6.66921574 8.61985035 -0.25542513 5.04195280  1.61272666 4.55510153   
2 6.73414453 8.62927542 -1.51465432 5.07896695  0.62450146 4.44407452   
3 6.69289553 8.63679074 -1.38702406 5.05304963 -0.22648726 4.45667397   
4 6.71938854 8.59919681  0.81773931 5.13238121 -0.82284288 4.49687783   

          V13        V14         V15        V16        V17         V18  \
0 -0.99138985 4.46129534  1.46817697 3.82876664 5.13524786  0.02579058   
1  0.48909502 4.48001710  0.63555809 3.94889498 5.10372361 -0.18336127   
2  0.71729273 4.47754169  2.34586495 3.49853846 5.22233365 -0.12135931   
3  0.50775687 4.46389984 -0.63141812 3.75102750 5.04764367  1.96577500   
4  1.34585159 4.36974320  0.17512113 3.83124070 5.09173508 -0.03819479   

          V19        V20        V21         V22        V23         V24  \
0  0.40399296 5.14526553 5.98431912  0.27783758 6.12825637  0.06692807   
1 -0.14578304 5.11402602 5.96695960 -0.63867195 6.14550959 -0.33984648   
2 -2.26185710 5.17178160 6.00652814  0.77167940 6.21091075 -0.68928096   
3 -1.23262197 5.10042228 5.97679527  0.00527360 6.12173828 -1.17557533   
4  0.80348692 5.16051234 5.98506069  0.79827849 6.12605436  0.14126698   

          V25         V26        V27        V28      Amount  Class  
0  0.12853936 -0.18911484 4.86818628 3.56228053 12.27273401      0  
1  0.16717040  0.12589453 4.85352410 3.56729867  1.92093727      0  
2 -0.32764183 -0.13909657 4.84874484 3.55684464 19.48486592      0  
3  0.64737603 -0.22192884 4.86090549 3.57384293 11.15795680      0  
4 -0.20600959  0.50229222 4.87699719 3.59528146  8.42555636      0  

👁️Observación:

  • Eliminar la columna Time, ya que no tiene relación significativa con la variable objetivo y no es comparable con las variables transformadas por PCA.

Escalado de Características¶

  • Usamos RobustScaler() para escalar Amount. Esto es útil porque RobustScaler utiliza la mediana y el rango intercuartílico (IQR), lo que lo hace menos sensible a los outliers. Esto es especialmente importante para Amount, ya que identificamos la presencia de valores atípicos en esta variable.
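Como comprobación rápida, un boceto mínimo (supuestos: scikit-learn y NumPy disponibles; datos de juguete inventados, no los del proyecto) que verifica que RobustScaler equivale a restar la mediana y dividir por el IQR:

```python
import numpy as np
from sklearn.preprocessing import RobustScaler

# Datos de juguete con un outlier extremo (100.0)
x = np.array([[1.0], [2.0], [3.0], [4.0], [100.0]])

escalado = RobustScaler().fit_transform(x)

# Cálculo manual: (x - mediana) / IQR, con IQR = Q3 - Q1
mediana = np.median(x)
q1, q3 = np.percentile(x, [25, 75])
manual = (x - mediana) / (q3 - q1)

print(np.allclose(escalado, manual))  # → True
```

Como la mediana y el IQR apenas se ven afectados por el valor 100.0, el escalado no queda distorsionado por el outlier.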
In [ ]:
# Escalado de la variable Amount usando RobustScaler
robust_scaler = RobustScaler()
clean_data[['Amount']] = robust_scaler.fit_transform(clean_data[['Amount']])
In [ ]:
# Verificar si la variable Amount ha sido escalada correctamente
print(clean_data.head())
          V1         V2         V3          V4          V5         V6  \
0 7.48650135 8.58154685 6.10060085  1.37815522 -0.33832077 5.25574863   
1 7.65502232 8.60127190 5.90317407  0.44815408  0.06001765 5.20366651   
2 7.48659840 8.50738294 6.03773081  0.37977959 -0.50319813 5.38154302   
3 7.51273838 8.57499280 6.03936895 -0.86329128 -0.01030888 5.32988828   
4 7.49995177 8.63675080 6.01911138  0.40303393 -0.40719338 5.22076885   

          V7         V8          V9        V10         V11        V12  \
0 6.69304416 8.62063898  0.36378697 5.06745070 -0.55159953 4.36645323   
1 6.66921574 8.61985035 -0.25542513 5.04195280  1.61272666 4.55510153   
2 6.73414453 8.62927542 -1.51465432 5.07896695  0.62450146 4.44407452   
3 6.69289553 8.63679074 -1.38702406 5.05304963 -0.22648726 4.45667397   
4 6.71938854 8.59919681  0.81773931 5.13238121 -0.82284288 4.49687783   

          V13        V14         V15        V16        V17         V18  \
0 -0.99138985 4.46129534  1.46817697 3.82876664 5.13524786  0.02579058   
1  0.48909502 4.48001710  0.63555809 3.94889498 5.10372361 -0.18336127   
2  0.71729273 4.47754169  2.34586495 3.49853846 5.22233365 -0.12135931   
3  0.50775687 4.46389984 -0.63141812 3.75102750 5.04764367  1.96577500   
4  1.34585159 4.36974320  0.17512113 3.83124070 5.09173508 -0.03819479   

          V19        V20        V21         V22        V23         V24  \
0  0.40399296 5.14526553 5.98431912  0.27783758 6.12825637  0.06692807   
1 -0.14578304 5.11402602 5.96695960 -0.63867195 6.14550959 -0.33984648   
2 -2.26185710 5.17178160 6.00652814  0.77167940 6.21091075 -0.68928096   
3 -1.23262197 5.10042228 5.97679527  0.00527360 6.12173828 -1.17557533   
4  0.80348692 5.16051234 5.98506069  0.79827849 6.12605436  0.14126698   

          V25         V26        V27        V28      Amount  Class  
0  0.12853936 -0.18911484 4.86818628 3.56228053  1.22386702      0  
1  0.16717040  0.12589453 4.85352410 3.56729867 -0.45360592      0  
2 -0.32764183 -0.13909657 4.84874484 3.55684464  2.39256810      0  
3  0.64737603 -0.22192884 4.86090549 3.57384293  1.04322122      0  
4 -0.20600959  0.50229222 4.87699719 3.59528146  0.60044515      0  
In [ ]:
# Verificar cantidad de observaciones por clase
clean_data['Class'].value_counts()
Out[ ]:
Class
0    280425
1       473
Name: count, dtype: int64


👁️Observación:

  • Mantener la columna Amount, ya que representa un dato importante para las transacciones y puede ser útil después de una transformación adecuada.

Evaluación Comparativa de Modelos¶

En esta sección, evaluamos diversas técnicas y algoritmos con el objetivo de analizar cómo un dataset altamente desbalanceado se comporta en distintos escenarios.

Esta exploración nos permitirá identificar el algoritmo más adecuado. Más adelante en el proyecto, una vez seleccionado el mejor algoritmo/modelo, nos enfocaremos en su optimización y ajuste fino para alcanzar las mejores métricas de rendimiento.
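Dado el fuerte desbalance de clases, los resultados se ordenan por AUPRC (average precision), que resume la curva Precision-Recall y es más informativa que la accuracy en este contexto. Un boceto mínimo de su cálculo (supuestos: scikit-learn disponible; datos de juguete inventados):

```python
import numpy as np
from sklearn.metrics import average_precision_score

# Datos de juguete muy desbalanceados: solo 2 positivos entre 10
y_true   = np.array([0, 0, 0, 0, 0, 0, 0, 0, 1, 1])
y_scores = np.array([0.1, 0.2, 0.1, 0.3, 0.2, 0.1, 0.4, 0.2, 0.9, 0.35])

# AUPRC en porcentaje, como en las tablas de resultados
auprc = average_precision_score(y_true, y_scores) * 100
print(f"AUPRC: {auprc:.1f}%")  # → AUPRC: 83.3%
```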

Modelos Evaluados:
Se evalúan los modelos tanto con el dataset original (clean_data) como con el balanceado de clases.

  • Regresión Logística (LogisticRegression)
  • Random Forest (RandomForestClassifier)
  • XGBoost (XGBClassifier)
  • CatBoost (CatBoostClassifier)
  • LightGBM (LGBMClassifier)

Ensembles:
Se evalúan los ensembles con y sin la aplicación de SMOTE.

  • StackingClassifier
  • VotingClassifier

Balanceado de Clases:

  • Oversampling (SMOTE)
  • Undersampling (RandomUnderSampler)

Validación Cruzada de los Mejores Modelos:

  • Validación Cruzada en dataset original (clean_data)
  • Validación Cruzada con SMOTE

Separa el dataset¶

In [ ]:
%%time

# Separar el dataset en características (X) y variable objetivo (y)
# Eliminamos la columna 'Class' de las características
X = clean_data.drop(['Class'], axis=1)
y = clean_data['Class']  # Seleccionamos la variable objetivo
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=42, stratify=y)

# Definir algoritmos a probar
models = {
    "Logistic Regression": LogisticRegression(max_iter=1000, random_state=42),
    "Random Forest": RandomForestClassifier(n_estimators=150, random_state=42),
    "XGBoost": XGBClassifier(use_label_encoder=False, eval_metric='logloss', random_state=42),
    "CatBoost": CatBoostClassifier(verbose=0, random_state=42),
    "LightGBM": LGBMClassifier(force_row_wise=True, random_state=42)
}

print('\nTamaños set de entrenamiento:', X_train.shape, y_train.shape)
# Proporción de clases en el conjunto de entrenamiento
print("\nDistribución en y_train:")
print(y_train.value_counts())

print('\nTamaño set de prueba:', X_test.shape, y_test.shape)
# Proporción de clases en el conjunto de prueba
print("\nDistribución en y_test:")
print(y_test.value_counts())
Tamaños set de entrenamiento: (224718, 29) (224718,)

Distribución en y_train:
Class
0    224340
1       378
Name: count, dtype: int64

Tamaño set de prueba: (56180, 29) (56180,)

Distribución en y_test:
Class
0    56085
1       95
Name: count, dtype: int64
CPU times: user 212 ms, sys: 17 ms, total: 229 ms
Wall time: 232 ms

Técnicas¶

El uso de técnicas permite:

  • Comparación Objetiva: Evaluar cómo cada técnica (balanceo, reducción de dimensionalidad, validación cruzada, ensembles) influye en el rendimiento de los algoritmos.

  • Identificación de la Mejor Configuración: Identificar el mejor enfoque de preprocesamiento y balanceo para optimizar la detección de fraudes.

  • Flexibilidad: Adaptar el proyecto a diferentes escenarios o datasets, asegurando una solución escalable y reproducible.

Técnica 1: Entrenamiento sobre el Dataset Original¶

Descripción:
En esta técnica usamos los datos originales (clean_data) sin aplicar reducción de dimensionalidad ni técnicas de balanceo.

Objetivo:
Servir como baseline para comparar el impacto del balanceo de clases.

In [ ]:
%%time

tecnica_name = "Tecnica 1"

# Evaluar cada modelo y consolidar métricas
results_tecnica_1 = pd.DataFrame()  # DataFrame para consolidar resultados globales

for name, model in models.items():
    print(f"\n=== Entrenando y evaluando modelo: {name} ===")

    # Medir el tiempo de inicio
    start_time = time.time()

    # Entrenar modelo
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    y_proba = model.predict_proba(X_test)[:, 1]

    # Medir el tiempo de fin
    end_time = time.time()
    execution_time = end_time - start_time  # Tiempo en segundos

    # Calcular métricas actualizadas y convertir a porcentaje
    auprc = average_precision_score(y_test, y_proba) * 100
    f1 = f1_score(y_test, y_pred) * 100
    mcc = matthews_corrcoef(y_test, y_pred) * 100
    balanced_acc = balanced_accuracy_score(y_test, y_pred) * 100
    accuracy = accuracy_score(y_test, y_pred) * 100
    precision = precision_score(y_test, y_pred) * 100
    recall = recall_score(y_test, y_pred) * 100

    # Consolidar métricas globales en un DataFrame
    results_tecnica_1 = pd.concat([results_tecnica_1, pd.DataFrame({
        "Modelo": [name],
        "AUPRC": [auprc],
        "Recall": [recall],
        "Balanced Accuracy": [balanced_acc],
        "F1-Score": [f1],
        "MCC": [mcc],
        "Accuracy (%)": [accuracy],
        "Precision": [precision],

        "Execution Time (s)": [execution_time],  # Agregar tiempo de ejecución
        "Tecnica":[tecnica_name]
    })], ignore_index=True)

    # Imprimir clasificación por clase
    print(f"\n=== Métricas por clase para {name} ===")
    print(classification_report(y_test, y_pred, zero_division=0))

print("\n=== Finalizada la evaluación de modelos ===\n")

# Mostrar consolidado de resultados
results_tecnica_1 = results_tecnica_1.sort_values(by=["AUPRC"], ascending=False, ignore_index=True)
print("\n=== Resultados Consolidados y Ordenados para la Técnica 1 ===")
print(results_tecnica_1)
=== Entrenando y evaluando modelo: Logistic Regression ===

=== Métricas por clase para Logistic Regression ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     56085
           1       0.87      0.62      0.72        95

    accuracy                           1.00     56180
   macro avg       0.93      0.81      0.86     56180
weighted avg       1.00      1.00      1.00     56180


=== Entrenando y evaluando modelo: Random Forest ===

=== Métricas por clase para Random Forest ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     56085
           1       0.94      0.77      0.84        95

    accuracy                           1.00     56180
   macro avg       0.97      0.88      0.92     56180
weighted avg       1.00      1.00      1.00     56180


=== Entrenando y evaluando modelo: XGBoost ===

=== Métricas por clase para XGBoost ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     56085
           1       0.94      0.79      0.86        95

    accuracy                           1.00     56180
   macro avg       0.97      0.89      0.93     56180
weighted avg       1.00      1.00      1.00     56180


=== Entrenando y evaluando modelo: CatBoost ===

=== Métricas por clase para CatBoost ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     56085
           1       0.95      0.80      0.87        95

    accuracy                           1.00     56180
   macro avg       0.97      0.90      0.93     56180
weighted avg       1.00      1.00      1.00     56180


=== Entrenando y evaluando modelo: LightGBM ===
[LightGBM] [Info] Number of positive: 378, number of negative: 224340
[LightGBM] [Info] Total Bins 7395
[LightGBM] [Info] Number of data points in the train set: 224718, number of used features: 29
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.001682 -> initscore=-6.386024
[LightGBM] [Info] Start training from score -6.386024

=== Métricas por clase para LightGBM ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     56085
           1       0.39      0.61      0.47        95

    accuracy                           1.00     56180
   macro avg       0.69      0.80      0.74     56180
weighted avg       1.00      1.00      1.00     56180


=== Finalizada la evaluación de modelos ===


=== Resultados Consolidados y Ordenados para la Técnica 1 ===
                Modelo       AUPRC      Recall  Balanced Accuracy    F1-Score  \
0             CatBoost 82.47349574 80.00000000        89.99643398 86.85714286   
1              XGBoost 82.38401153 78.94736842        89.46922669 85.71428571   
2        Random Forest 81.36729394 76.84210526        88.41659511 84.39306358   
3  Logistic Regression 71.80130067 62.10526316        81.04460804 72.39263804   
4             LightGBM 32.41474924 61.05263158        80.44429742 47.34693878   

          MCC  Accuracy (%)   Precision  Execution Time (s)    Tecnica  
0 87.15855237   99.95906016 95.00000000         75.42357993  Tecnica 1  
1 86.00968433   99.95550018 93.75000000          4.07381248  Tecnica 1  
2 84.78080923   99.95194019 93.58974359        511.14025688  Tecnica 1  
3 73.37008307   99.91990032 86.76470588          1.47671819  Tecnica 1  
4 48.48028629   99.77038092 38.66666667          6.99200821  Tecnica 1  
CPU times: user 10min 27s, sys: 4.24 s, total: 10min 31s
Wall time: 10min
Técnica 2: Balanceado con SMOTE¶

Descripción: Usamos SMOTE para balancear las clases y evaluamos su impacto en la predicción de los fraudes.

In [ ]:
%%time

tecnica_name = "Tecnica 2"

print("Distribución de clases antes del SMOTE:\n")
print(y_train.value_counts())

# Aplicar SMOTE solo al conjunto de entrenamiento
print("\nAplicando SMOTE al conjunto de entrenamiento...")
smote = SMOTE(random_state=42)
X_train_smote, y_train_smote = smote.fit_resample(X_train, y_train)

print("\nDistribución de clases después del SMOTE:")
print(pd.Series(y_train_smote).value_counts())

# Evaluar cada modelo y consolidar métricas
results_tecnica_2 = pd.DataFrame() # DataFrame para consolidar resultados globales

for name, model in models.items():
    print(f"\n=== Entrenando y evaluando modelo: {name} ===")

    # Medir el tiempo de inicio
    start_time = time.time()

    # Entrenar modelo
    model.fit(X_train_smote, y_train_smote)
    y_pred = model.predict(X_test)
    y_proba = model.predict_proba(X_test)[:, 1]

    # Medir el tiempo de fin
    end_time = time.time()
    execution_time = end_time - start_time  # Tiempo en segundos

    # Calcular métricas actualizadas y convertir a porcentaje
    auprc = average_precision_score(y_test, y_proba) * 100
    f1 = f1_score(y_test, y_pred) * 100
    mcc = matthews_corrcoef(y_test, y_pred) * 100
    balanced_acc = balanced_accuracy_score(y_test, y_pred) * 100
    accuracy = accuracy_score(y_test, y_pred) * 100
    precision = precision_score(y_test, y_pred) * 100
    recall = recall_score(y_test, y_pred) * 100

    # Consolidar métricas globales en un DataFrame
    results_tecnica_2 = pd.concat([results_tecnica_2, pd.DataFrame({
        "Modelo": [name],
        "AUPRC": [auprc],
        "Recall": [recall],
        "Balanced Accuracy": [balanced_acc],
        "F1-Score": [f1],
        "MCC": [mcc],
        "Accuracy (%)": [accuracy],
        "Precision": [precision],

        "Execution Time (s)": [execution_time],  # Agregar tiempo de ejecución
        "Tecnica": [tecnica_name]
    })], ignore_index=True)

    # Imprimir clasificación por clase
    print(f"\n=== Métricas por clase para {name} ===")
    print(classification_report(y_test, y_pred, zero_division=0))

print("\n=== Finalizada la evaluación de modelos ===\n")

# Mostrar consolidado de resultados
results_tecnica_2 = results_tecnica_2.sort_values(
    by=["AUPRC"], ascending=False, ignore_index=True
)
print("\n=== Resultados Consolidados y Ordenados para la Técnica 2 ===")
print(results_tecnica_2)
Distribución de clases antes del SMOTE:

Class
0    224340
1       378
Name: count, dtype: int64

Aplicando SMOTE al conjunto de entrenamiento...

Distribución de clases después del SMOTE:
Class
0    224340
1    224340
Name: count, dtype: int64

=== Entrenando y evaluando modelo: Logistic Regression ===

=== Métricas por clase para Logistic Regression ===
              precision    recall  f1-score   support

           0       1.00      0.98      0.99     56085
           1       0.06      0.91      0.12        95

    accuracy                           0.98     56180
   macro avg       0.53      0.94      0.55     56180
weighted avg       1.00      0.98      0.99     56180


=== Entrenando y evaluando modelo: Random Forest ===

=== Métricas por clase para Random Forest ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     56085
           1       0.86      0.81      0.83        95

    accuracy                           1.00     56180
   macro avg       0.93      0.91      0.92     56180
weighted avg       1.00      1.00      1.00     56180


=== Entrenando y evaluando modelo: XGBoost ===

=== Métricas por clase para XGBoost ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     56085
           1       0.74      0.83      0.78        95

    accuracy                           1.00     56180
   macro avg       0.87      0.92      0.89     56180
weighted avg       1.00      1.00      1.00     56180


=== Entrenando y evaluando modelo: CatBoost ===

=== Métricas por clase para CatBoost ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     56085
           1       0.63      0.83      0.72        95

    accuracy                           1.00     56180
   macro avg       0.82      0.92      0.86     56180
weighted avg       1.00      1.00      1.00     56180


=== Entrenando y evaluando modelo: LightGBM ===
[LightGBM] [Info] Number of positive: 224340, number of negative: 224340
[LightGBM] [Info] Total Bins 7395
[LightGBM] [Info] Number of data points in the train set: 448680, number of used features: 29
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.500000 -> initscore=0.000000

=== Métricas por clase para LightGBM ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     56085
           1       0.53      0.83      0.64        95

    accuracy                           1.00     56180
   macro avg       0.76      0.92      0.82     56180
weighted avg       1.00      1.00      1.00     56180


=== Finalizada la evaluación de la Tecnica 2 ===


=== Resultados Consolidados y Ordenados para la Tecnica 2 ===
                Modelo       AUPRC      Recall  Balanced Accuracy    F1-Score  \
0              XGBoost 84.23706455 83.15789474        91.55398526 78.21782178   
1             CatBoost 82.28680058 83.15789474        91.53793819 71.81818182   
2        Random Forest 81.82469129 81.05263158        90.51472624 83.24324324   
3             LightGBM 78.66658925 83.15789474        91.51565059 64.48979592   
4  Logistic Regression 70.97166654 90.52631579        94.09261319 11.51271754   

          MCC  Accuracy (%)   Precision  Execution Time (s)    Tecnica  
0 78.31754347   99.92168031 73.83177570         11.18287110  Tecnica 2  
1 72.44324757   99.88964044 63.20000000        132.11596417  Tecnica 2  
2 83.24613795   99.94482022 85.55555556        817.67223072  Tecnica 2  
3 66.11059529   99.84514062 52.66666667         11.16479063  Tecnica 2  
4 23.25183132   97.64684941  6.14724803         38.16073847  Tecnica 2  
CPU times: user 18min 21s, sys: 17.2 s, total: 18min 38s
Wall time: 16min 52s
Técnica 3: Balanceado con RandomUnderSampler¶

Descripción: Aplicamos RandomUnderSampler para balancear las clases y evaluamos su impacto en la detección de fraudes.

El método RandomUnderSampler de la librería imbalanced-learn submuestrea la clase mayoritaria para igualar el número de registros de la clase minoritaria.
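A modo de ilustración, la idea detrás del submuestreo aleatorio puede esbozarse con numpy sobre un dataset de juguete (tamaños y nombres hipotéticos, no los del proyecto):

```python
import numpy as np

rng = np.random.default_rng(42)

# Dataset de juguete desbalanceado: 100 registros legítimos y 10 fraudes
X = rng.normal(size=(110, 3))
y = np.array([0] * 100 + [1] * 10)

# Lo que hace RandomUnderSampler en esencia: tomar una muestra aleatoria
# de la clase mayoritaria del mismo tamaño que la clase minoritaria
idx_min = np.where(y == 1)[0]
idx_maj = rng.choice(np.where(y == 0)[0], size=len(idx_min), replace=False)
idx_bal = np.concatenate([idx_maj, idx_min])

X_bal, y_bal = X[idx_bal], y[idx_bal]
print(np.bincount(y_bal))  # [10 10]
```

El coste de esta técnica es descartar la mayor parte de la clase mayoritaria, lo que explica los tiempos de entrenamiento muy bajos y la caída de precisión que se observa más abajo.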

In [ ]:
%%time

tecnica_name = "Tecnica 3"

# Distribución de clases antes del RandomUnderSampler
print("Distribución de clases antes del RandomUnderSampler:")
print(y_train.value_counts())

# Aplicar RandomUnderSampler
print("\nAplicando RandomUnderSampler...")
rus = RandomUnderSampler(random_state=42)
X_rus, y_rus = rus.fit_resample(X_train, y_train)

# Mostrar distribución después del balanceo
print("\nDistribución de clases después del RandomUnderSampler:")
print(pd.Series(y_rus).value_counts())

# Evaluar cada modelo y consolidar métricas
results_tecnica_3 = pd.DataFrame()

for name, model in models.items():
    print(f"\n=== Entrenando y evaluando modelo: {name} ===")

    # Ajuste específico para LightGBM si hay pocos datos
    if name == "LightGBM":
        model = LGBMClassifier(
            random_state=42,
            min_child_samples=10,    # Reemplaza min_data_in_leaf para evitar advertencias
            max_bin=128,             # Reducir número de bins
            num_leaves=15,           # Limitar el número de hojas
            force_col_wise=True      # Forzar paralelización por columnas para evitar overhead
        )

    # Medir el tiempo de inicio
    start_time = time.time()

    # Entrenar modelo con los datos balanceados
    model.fit(X_rus, y_rus)

    # Predicciones y probabilidades
    y_pred = model.predict(X_test)
    y_proba = model.predict_proba(X_test)[:, 1]

    # Medir el tiempo de fin
    end_time = time.time()
    execution_time = end_time - start_time

    # Calcular métricas actualizadas y convertir a porcentaje
    auprc = average_precision_score(y_test, y_proba) * 100
    f1 = f1_score(y_test, y_pred) * 100
    mcc = matthews_corrcoef(y_test, y_pred) * 100
    balanced_acc = balanced_accuracy_score(y_test, y_pred) * 100
    accuracy = accuracy_score(y_test, y_pred) * 100
    precision = precision_score(y_test, y_pred, zero_division=0) * 100
    recall = recall_score(y_test, y_pred) * 100

    # Consolidar métricas globales en un DataFrame
    results_tecnica_3 = pd.concat([results_tecnica_3, pd.DataFrame({
        "Modelo": [name],
        "AUPRC": [auprc],
        "Recall": [recall],
        "Balanced Accuracy": [balanced_acc],
        "F1-Score": [f1],
        "MCC": [mcc],
        "Accuracy (%)": [accuracy],
        "Precision": [precision],
        "Execution Time (s)": [execution_time],
        "Tecnica": [tecnica_name]
    })], ignore_index=True)

    # Imprimir clasificación por clase
    print(f"\n=== Métricas por clase para {name} ===")
    print(classification_report(y_test, y_pred, zero_division=0))

# Mostrar consolidado de resultados
results_tecnica_3 = results_tecnica_3.sort_values(by=["AUPRC"],
                                                    ascending=False,
                                                    ignore_index=True)
print("\n=== Resultados Consolidados y Ordenados para la Tecnica 3 ===")
print(results_tecnica_3)
Distribución de clases antes del RandomUnderSampler:
Class
0    224340
1       378
Name: count, dtype: int64

Aplicando RandomUnderSampler...

Distribución de clases después del RandomUnderSampler:
Class
0    378
1    378
Name: count, dtype: int64

=== Entrenando y evaluando modelo: Logistic Regression ===

=== Métricas por clase para Logistic Regression ===
              precision    recall  f1-score   support

           0       1.00      0.97      0.98     56085
           1       0.04      0.88      0.08        95

    accuracy                           0.97     56180
   macro avg       0.52      0.93      0.53     56180
weighted avg       1.00      0.97      0.98     56180


=== Entrenando y evaluando modelo: Random Forest ===

=== Métricas por clase para Random Forest ===
              precision    recall  f1-score   support

           0       1.00      0.98      0.99     56085
           1       0.06      0.89      0.11        95

    accuracy                           0.98     56180
   macro avg       0.53      0.94      0.55     56180
weighted avg       1.00      0.98      0.99     56180


=== Entrenando y evaluando modelo: XGBoost ===

=== Métricas por clase para XGBoost ===
              precision    recall  f1-score   support

           0       1.00      0.96      0.98     56085
           1       0.04      0.94      0.08        95

    accuracy                           0.96     56180
   macro avg       0.52      0.95      0.53     56180
weighted avg       1.00      0.96      0.98     56180


=== Entrenando y evaluando modelo: CatBoost ===

=== Métricas por clase para CatBoost ===
              precision    recall  f1-score   support

           0       1.00      0.98      0.99     56085
           1       0.07      0.89      0.12        95

    accuracy                           0.98     56180
   macro avg       0.53      0.94      0.56     56180
weighted avg       1.00      0.98      0.99     56180


=== Entrenando y evaluando modelo: LightGBM ===
[LightGBM] [Info] Number of positive: 378, number of negative: 378
[LightGBM] [Info] Total Bins 3712
[LightGBM] [Info] Number of data points in the train set: 756, number of used features: 29
[LightGBM] [Info] [binary:BoostFromScore]: pavg=0.500000 -> initscore=0.000000

=== Métricas por clase para LightGBM ===
              precision    recall  f1-score   support

           0       1.00      0.97      0.98     56085
           1       0.05      0.91      0.09        95

    accuracy                           0.97     56180
   macro avg       0.52      0.94      0.54     56180
weighted avg       1.00      0.97      0.98     56180


=== Resultados Consolidados y Ordenados para la Tecnica 3 ===
                Modelo       AUPRC      Recall  Balanced Accuracy    F1-Score  \
0        Random Forest 72.11181402 89.47368421        93.54757581 11.22853369   
1             LightGBM 71.97307909 90.52631579        93.69321941  8.85684861   
2              XGBoost 69.24568170 93.68421053        94.96994693  7.79334501   
3  Logistic Regression 66.84279885 88.42105263        92.54163089  8.19112628   
4             CatBoost 65.41504902 89.47368421        93.68308442 12.48164464   

          MCC  Accuracy (%)  Precision  Execution Time (s)    Tecnica  
0 22.80616497   97.60768957 5.99013390          2.10636878  Tecnica 3  
1 20.13556942   96.84941260 4.65619924          0.85839844  Tecnica 3  
2 19.09650843   96.25133499 4.06578346          0.96711493  Tecnica 3  
3 19.06992981   96.64827341 4.29447853          0.31764269  Tecnica 3  
4 24.17700852   97.87824849 6.70876085          7.46699548  Tecnica 3  
CPU times: user 19.7 s, sys: 631 ms, total: 20.3 s
Wall time: 13.1 s
Técnica 4: Validación Cruzada de Mejores Modelos (Balanceado con SMOTE)¶

Descripción: Ejecutamos la Validación Cruzada Estratificada (Cross-Validation) con StratifiedKFold para los modelos seleccionados y usamos el método SMOTE para balancear las clases, aplicándolo solo al conjunto de entrenamiento de cada pliegue.
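El patrón clave de esta técnica, balancear solo dentro del pliegue de entrenamiento para evitar fugas de información (data leakage), puede esbozarse así (datos de juguete y, en lugar de SMOTE, una simple duplicación aleatoria de la clase minoritaria, para mantener el ejemplo autocontenido):

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))
y = np.array([0] * 180 + [1] * 20)

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for train_idx, test_idx in cv.split(X, y):
    X_tr, y_tr = X[train_idx], y[train_idx]
    # El balanceo (SMOTE en el notebook; aquí duplicación aleatoria) se
    # aplica SOLO al pliegue de entrenamiento, nunca al de prueba
    idx_min = np.where(y_tr == 1)[0]
    faltan = (y_tr == 0).sum() - len(idx_min)
    extra = rng.choice(idx_min, size=faltan, replace=True)
    X_bal = np.vstack([X_tr, X_tr[extra]])
    y_bal = np.concatenate([y_tr, y_tr[extra]])
    assert (y_bal == 0).sum() == (y_bal == 1).sum()  # entrenamiento balanceado
    assert (y[test_idx] == 1).sum() == 4             # pliegue de prueba intacto
```

Si se aplicara el sobremuestreo antes de dividir los pliegues, copias (sintéticas o no) de un mismo fraude podrían caer a la vez en entrenamiento y prueba, inflando las métricas.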

In [ ]:
%%time
# Validación Cruzada - Balanceado con SMOTE (Oversampling)

tecnica_name = "Tecnica 4"

# Modelos seleccionados con parámetros específicos para validación cruzada
selected_models_cv = {
    "XGBoost": XGBClassifier(use_label_encoder=False, eval_metric='logloss',
                             tree_method='hist', random_state=42),
    "CatBoost": CatBoostClassifier(verbose=0, thread_count=-1, random_state=42)
}

# Configuración de validación cruzada
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)

# DataFrame para consolidar resultados globales
results_tecnica_4 = pd.DataFrame()

print(f"\n=== Aguarde mientras se ejecuta la Validación Cruzada... ===")
# Validación cruzada para cada modelo
for name, model in selected_models_cv.items():
    total_execution_time = 0

    # Listas para guardar métricas por pliegue
    scores_auprc, scores_recall, scores_f1 = [], [], []
    scores_precision, scores_balanced_acc, scores_mcc = [], [], []
    scores_accuracy = []

    # Validación cruzada
    for train_idx, test_idx in cv.split(X_train, y_train):
        # Dividir datos en entrenamiento y prueba
        X_train_fold, y_train_fold = X_train.iloc[train_idx], y_train.iloc[train_idx]
        X_test_fold, y_test_fold = X_train.iloc[test_idx], y_train.iloc[test_idx]

        # Aplicar SMOTE al conjunto de entrenamiento
        smote = SMOTE(random_state=42)
        X_train_smote, y_train_smote = smote.fit_resample(X_train_fold, y_train_fold)

        # Medir tiempo de inicio
        start_time = time.time()

        # Entrenar modelo (fit devuelve el propio estimador; se reajusta en cada pliegue)
        model.fit(X_train_smote, y_train_smote)

        # Medir tiempo de fin
        end_time = time.time()
        total_execution_time += (end_time - start_time)

        # Predicciones y probabilidades
        y_pred = model.predict(X_test_fold)
        y_proba = model.predict_proba(X_test_fold)[:, 1]

        # Calcular métricas
        scores_auprc.append(average_precision_score(y_test_fold, y_proba) * 100)
        scores_recall.append(recall_score(y_test_fold, y_pred) * 100)
        scores_balanced_acc.append(balanced_accuracy_score(y_test_fold, y_pred) * 100)
        scores_f1.append(f1_score(y_test_fold, y_pred) * 100)
        scores_precision.append(precision_score(y_test_fold, y_pred, zero_division=0) * 100)
        scores_mcc.append(matthews_corrcoef(y_test_fold, y_pred) * 100)
        scores_accuracy.append(accuracy_score(y_test_fold, y_pred) * 100)


    print(f"\n=== Consolidando métricas promedio para el Modelo: {name} ===")
    # Consolidar métricas promedio para el modelo actual
    results_tecnica_4 = pd.concat([results_tecnica_4, pd.DataFrame({
        "Modelo": [name],
        "AUPRC": [sum(scores_auprc) / len(scores_auprc)],
        "Recall": [sum(scores_recall) / len(scores_recall)],
        "Balanced Accuracy": [sum(scores_balanced_acc) / len(scores_balanced_acc)],
        "F1-Score": [sum(scores_f1) / len(scores_f1)],
        "MCC": [sum(scores_mcc) / len(scores_mcc)],
        "Accuracy (%)": [sum(scores_accuracy) / len(scores_accuracy)],
        "Precision": [sum(scores_precision) / len(scores_precision)],
        "Execution Time (s)": [total_execution_time],
        "Tecnica": [tecnica_name]
    })], ignore_index=True)


    # Imprimir clasificación por clase (corresponde al último pliegue de la validación)
    print(f"\n=== Métricas por clase para {name} ===")
    print(classification_report(y_test_fold, y_pred, zero_division=0))

print(f"\n=== Finalizada Validación Cruzada ===")

# Ordenar resultados por AUPRC, Recall y Balanced Accuracy
results_tecnica_4 = results_tecnica_4.sort_values(
    by=["AUPRC", "Recall", "Balanced Accuracy"],
    ascending=False
)

# Mostrar resultados
print("\nResultados de Validación Cruzada con SMOTE:")
print(results_tecnica_4)
=== Aguarde mientras se ejecuta la Validación Cruzada... ===

=== Consolidando métricas promedio para el Modelo: XGBoost ===

=== Métricas por clase para XGBoost ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     44868
           1       0.83      0.84      0.83        75

    accuracy                           1.00     44943
   macro avg       0.91      0.92      0.92     44943
weighted avg       1.00      1.00      1.00     44943


=== Consolidando métricas promedio para el Modelo: CatBoost ===

=== Métricas por clase para CatBoost ===
              precision    recall  f1-score   support

           0       1.00      1.00      1.00     44868
           1       0.66      0.84      0.74        75

    accuracy                           1.00     44943
   macro avg       0.83      0.92      0.87     44943
weighted avg       1.00      1.00      1.00     44943


=== Finalizada Validación Cruzada ===

Resultados de Validación Cruzada con SMOTE:
     Modelo       AUPRC      Recall  Balanced Accuracy    F1-Score  \
0   XGBoost 84.02092021 82.55087719        91.25493401 79.88193001   
1  CatBoost 82.94613490 82.54385965        91.23203502 71.93298517   

          MCC  Accuracy (%)   Precision  Execution Time (s)    Tecnica  
0 79.95588323   99.92968975 77.64867957         44.32414532  Tecnica 4  
1 72.51650476   99.89097459 63.87698830        519.03088927  Tecnica 4  
CPU times: user 14min 54s, sys: 13.5 s, total: 15min 7s
Wall time: 9min 30s
Técnica 5: Validación Cruzada de Mejores Modelos (dataset original)¶

Descripción: Ejecutamos la Validación Cruzada Estratificada (Cross-Validation) con StratifiedKFold para los modelos seleccionados. En esta técnica no hacemos balanceo de clases; trabajamos con el dataset original (clean_data).
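La razón de usar StratifiedKFold (y no un KFold simple) con clases tan desbalanceadas puede verse en un esbozo mínimo con datos de juguete (proporciones hipotéticas):

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

# Dataset de juguete: 5 positivos entre 100 registros
y = np.array([0] * 95 + [1] * 5)
X = np.zeros((100, 2))

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for _, test_idx in cv.split(X, y):
    # La estratificación garantiza que cada pliegue de prueba conserve
    # la proporción de clases: exactamente 1 positivo entre 20 registros
    assert len(test_idx) == 20
    assert (y[test_idx] == 1).sum() == 1
```

Sin estratificación, algunos pliegues podrían quedarse sin ningún fraude, haciendo que Recall y AUPRC de ese pliegue no fueran calculables o carecieran de sentido.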

In [ ]:
%%time
# Validación Cruzada con dataset original (SIN SMOTE)

tecnica_name = "Tecnica 5"

# Modelos seleccionados con parámetros específicos para validación cruzada
selected_models_cv = {
    "XGBoost": XGBClassifier(use_label_encoder=False, eval_metric='logloss',
                             tree_method='hist', random_state=42),
    "CatBoost": CatBoostClassifier(verbose=0, thread_count=-1, random_state=42)
}

# Configuración de validación cruzada
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)

# DataFrame para consolidar resultados globales
results_tecnica_5 = pd.DataFrame()

print(f"\n=== Aguarde mientras se ejecuta la Validación Cruzada... ===")
# Validación cruzada para cada modelo
for name, model in selected_models_cv.items():
    total_execution_time = 0

    # Listas para guardar métricas por pliegue
    scores_auprc, scores_recall, scores_f1 = [], [], []
    scores_precision, scores_balanced_acc, scores_mcc = [], [], []
    scores_accuracy = []

    # Validación cruzada
    for train_idx, test_idx in cv.split(X_train, y_train):
        # Dividir datos en entrenamiento y prueba
        X_train_fold, y_train_fold = X_train.iloc[train_idx], y_train.iloc[train_idx]
        X_test_fold, y_test_fold = X_train.iloc[test_idx], y_train.iloc[test_idx]

        # Medir tiempo de inicio
        start_time = time.time()

        # Entrenar modelo con los datos de entrenamiento del pliegue
        # (fit devuelve el propio estimador; se reajusta en cada pliegue)
        model.fit(X_train_fold, y_train_fold)

        # Medir tiempo de fin
        end_time = time.time()
        total_execution_time += (end_time - start_time)

        # Predicciones y probabilidades
        y_pred = model.predict(X_test_fold)
        y_proba = model.predict_proba(X_test_fold)[:, 1]

        # Calcular métricas
        scores_auprc.append(average_precision_score(y_test_fold, y_proba) * 100)
        scores_recall.append(recall_score(y_test_fold, y_pred) * 100)
        scores_balanced_acc.append(balanced_accuracy_score(y_test_fold, y_pred) * 100)
        scores_f1.append(f1_score(y_test_fold, y_pred) * 100)
        scores_precision.append(precision_score(y_test_fold, y_pred, zero_division=0) * 100)
        scores_mcc.append(matthews_corrcoef(y_test_fold, y_pred) * 100)
        scores_accuracy.append(accuracy_score(y_test_fold, y_pred) * 100)


    print(f"\n=== Consolidando métricas promedio para el Modelo: {name} ===")
    # Consolidar métricas promedio para el modelo actual
    results_tecnica_5 = pd.concat([results_tecnica_5, pd.DataFrame({
        "Modelo": [name],
        "AUPRC": [sum(scores_auprc) / len(scores_auprc)],
        "Recall": [sum(scores_recall) / len(scores_recall)],
        "Balanced Accuracy": [sum(scores_balanced_acc) / len(scores_balanced_acc)],
        "F1-Score": [sum(scores_f1) / len(scores_f1)],
        "MCC": [sum(scores_mcc) / len(scores_mcc)],
        "Accuracy (%)": [sum(scores_accuracy) / len(scores_accuracy)],
        "Precision": [sum(scores_precision) / len(scores_precision)],
        "Execution Time (s)": [total_execution_time],
        "Tecnica": [tecnica_name]
    })], ignore_index=True)

print(f"\n=== Finalizada Validación Cruzada ===")

# Ordenar resultados por AUPRC, Recall y Balanced Accuracy
results_tecnica_5 = results_tecnica_5.sort_values(
    by=["AUPRC", "Recall", "Balanced Accuracy"],
    ascending=False
)

# Mostrar resultados
print("\nResultados de Validación Cruzada sin SMOTE:")
print(results_tecnica_5)
=== Aguarde mientras se ejecuta la Validación Cruzada... ===

=== Consolidando métricas promedio para el Modelo: XGBoost ===

=== Consolidando métricas promedio para el Modelo: CatBoost ===

=== Finalizada Validación Cruzada ===

Resultados de Validación Cruzada sin SMOTE:
     Modelo       AUPRC      Recall  Balanced Accuracy    F1-Score  \
0   XGBoost 84.77061049 77.25263158        88.62230402 84.85060860   
1  CatBoost 84.27051518 79.37192982        89.68284465 86.68457180   

          MCC  Accuracy (%)   Precision  Execution Time (s)    Tecnica  
0 85.28456551   99.95371979 94.27725659         18.68230915  Tecnica 5  
1 87.07311562   99.95905982 95.63294319        302.70576501  Tecnica 5  
CPU times: user 8min 25s, sys: 12.3 s, total: 8min 37s
Wall time: 5min 24s
Técnica 6: Ensembles con SMOTE¶

Descripción:

  • Implementamos ensembles como VotingClassifier y StackingClassifier con balanceo de clases (SMOTE).
  • Evaluamos su rendimiento comparado con los modelos individuales.
  • Los modelos seleccionados para esta sección han sido los que obtuvieron mejores métricas en las técnicas 1, 2 y 3.
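La mecánica del voto "soft" (promediar las probabilidades de los modelos base) puede esbozarse con un dataset sintético y modelos ligeros de sklearn como sustitutos de XGBoost y CatBoost:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Dataset sintético desbalanceado como sustituto del dataset real
X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=42)

# Con voting="soft" el ensemble promedia las probabilidades de los modelos
# base (en el notebook, XGBoost y CatBoost; aquí sustitutos de sklearn)
voting = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000, random_state=42)),
        ("dt", DecisionTreeClassifier(random_state=42)),
    ],
    voting="soft",
)
voting.fit(X, y)
print(voting.predict_proba(X[:3]).shape)  # (3, 2)
```

El voto "soft" exige que todos los modelos base implementen predict_proba, condición que cumplen los cinco modelos evaluados en las técnicas anteriores.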
In [ ]:
%%time

tecnica_name = "Tecnica 6"

# Definir modelos base
models = {
    "XGBoost": XGBClassifier(use_label_encoder=False, eval_metric="logloss", random_state=42),
    "CatBoost": CatBoostClassifier(verbose=0, random_state=42)
}

# Validar claves del diccionario models
print("Modelos disponibles en 'models':", models.keys())

# Configurar VotingClassifier
voting_clf = VotingClassifier(
    estimators=[
        ("XGBoost", models["XGBoost"]),
        ("CatBoost", models["CatBoost"])
    ],
    voting="soft"
)

# Configurar StackingClassifier
stacking_clf = StackingClassifier(
    estimators=[
        ("XGBoost", models["XGBoost"]),
        ("CatBoost", models["CatBoost"])
    ],
    final_estimator=LogisticRegression(random_state=42)
)

# Mostrar distribución antes del SMOTE
print("\nDistribución de clases antes del SMOTE:")
print(y_train.value_counts())

# Aplicar SMOTE en el set de entrenamiento
print("\nAplicando SMOTE en el set de entrenamiento...")
smote = SMOTE(random_state=42)
X_train_smote, y_train_smote = smote.fit_resample(X_train, y_train)

print("\nDistribución de clases después del SMOTE:")
print(pd.Series(y_train_smote).value_counts())

# Evaluar ensembles
ensembles = {
    "VotingClassifier": voting_clf,
    "StackingClassifier": stacking_clf
}

results_tecnica_6 = []
print("\n=== Iniciando evaluación de ensembles ===")

for ensemble_name, ensemble in ensembles.items():
    print(f"\n=== Entrenando y evaluando ensemble: {ensemble_name} ===")
    base_models = ", ".join([estimator[0] for estimator in ensemble.estimators])

    start_time = time.time()
    ensemble.fit(X_train_smote, y_train_smote)
    end_time = time.time()
    execution_time = end_time - start_time

    y_pred = ensemble.predict(X_test)
    y_proba = ensemble.predict_proba(X_test)[:, 1]

    # Calcular métricas
    auprc = average_precision_score(y_test, y_proba)
    recall = recall_score(y_test, y_pred)
    balanced_acc = balanced_accuracy_score(y_test, y_pred)
    f1 = f1_score(y_test, y_pred)
    mcc = matthews_corrcoef(y_test, y_pred)
    accuracy = accuracy_score(y_test, y_pred)
    precision = precision_score(y_test, y_pred, zero_division=0)

    results_tecnica_6.append({
        "Ensemble": ensemble_name,
        "Modelo": base_models,
        "AUPRC": auprc * 100,
        "Recall": recall * 100,
        "Balanced Accuracy": balanced_acc * 100,
        "F1-Score": f1 * 100,
        "MCC": mcc * 100,
        "Accuracy (%)": accuracy * 100,
        "Precision": precision * 100,
        "Execution Time (s)": execution_time,
        "Tecnica": tecnica_name
    })

# Crear DataFrame de resultados
results_tecnica_6 = pd.DataFrame(results_tecnica_6)

# Ordenar por métricas clave
results_tecnica_6 = results_tecnica_6.sort_values(
    by=["AUPRC", "Recall", "Balanced Accuracy"],
    ascending=[False, False, False]
)

# Mostrar resultados
print("\nResultados Ensembles (Ordenados):")
print(results_tecnica_6)
Modelos disponibles en 'models': dict_keys(['XGBoost', 'CatBoost'])

Distribución de clases antes del SMOTE:
Class
0    224340
1       378
Name: count, dtype: int64

Aplicando SMOTE en el set de entrenamiento...

Distribución de clases después del SMOTE:
Class
0    224340
1    224340
Name: count, dtype: int64

=== Iniciando evaluación de ensembles ===

=== Entrenando y evaluando ensemble: VotingClassifier ===

=== Entrenando y evaluando ensemble: StackingClassifier ===

Resultados Ensembles (Ordenados):
             Ensemble             Modelo       AUPRC      Recall  \
1  StackingClassifier  XGBoost, CatBoost 83.75458887 82.10526316   
0    VotingClassifier  XGBoost, CatBoost 83.65617379 83.15789474   

   Balanced Accuracy    F1-Score         MCC  Accuracy (%)   Precision  \
1        91.03123548 79.18781726 79.20150068   99.92702029 76.47058824   
0        91.55131075 77.07317073 77.23928912   99.91634033 71.81818182   

   Execution Time (s)    Tecnica  
1        704.98505759  Tecnica 6  
0        140.36777210  Tecnica 6  
CPU times: user 22min 17s, sys: 19.7 s, total: 22min 37s
Wall time: 14min 7s
Técnica 7: Ensembles con dataset original (clean_data)¶

Descripción:

  • Implementamos ensembles como VotingClassifier y StackingClassifier.
  • Evaluamos su rendimiento comparado con los modelos individuales.
  • Los modelos seleccionados para esta sección han sido los que obtuvieron mejores métricas en las técnicas 1, 2 y 3.
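A diferencia del voto, el stacking entrena un meta-modelo (final_estimator) sobre las predicciones out-of-fold de los modelos base. Un esbozo mínimo con sustitutos ligeros de sklearn en lugar de XGBoost y CatBoost:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Dataset sintético desbalanceado como sustituto del dataset real
X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=42)

# El meta-modelo (LogisticRegression) aprende a combinar las predicciones
# de los modelos base, generadas por validación cruzada interna para
# evitar sobreajuste del nivel superior
stacking = StackingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000, random_state=42)),
        ("dt", DecisionTreeClassifier(random_state=42)),
    ],
    final_estimator=LogisticRegression(random_state=42),
)
stacking.fit(X, y)
print(stacking.predict(X[:3]))
```

Esa validación cruzada interna explica por qué el StackingClassifier tarda varias veces más que el VotingClassifier en las tablas de resultados de las técnicas 6 y 7.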
In [ ]:
%%time
# con dataset original (SIN SMOTE)

tecnica_name = "Tecnica 7"

# Definir modelos base
models = {
    "XGBoost": XGBClassifier(use_label_encoder=False, eval_metric="logloss", random_state=42),
    "CatBoost": CatBoostClassifier(verbose=0, random_state=42)
}

# Configurar VotingClassifier
voting_clf = VotingClassifier(
    estimators=[
        ("XGBoost", models["XGBoost"]),
        ("CatBoost", models["CatBoost"])
    ],
    voting="soft"
)

# Configurar StackingClassifier
stacking_clf = StackingClassifier(
    estimators=[
        ("XGBoost", models["XGBoost"]),
        ("CatBoost", models["CatBoost"])
    ],
    final_estimator=LogisticRegression(random_state=42)
)

# Evaluar ensembles
ensembles = {
    "VotingClassifier": voting_clf,
    "StackingClassifier": stacking_clf
}

# Lista para almacenar resultados
results_tecnica_7 = []
print("\n=== Iniciando evaluación de ensembles ===")

for ensemble_name, ensemble in ensembles.items():
    print(f"\n=== Entrenando y evaluando ensemble: {ensemble_name} ===")

    # Extraer nombres de los modelos base (antes de fit, 'estimators' contiene
    # las tuplas (nombre, estimador) originales)
    base_models = ", ".join([est_name for est_name, _ in ensemble.estimators])

    # Medir tiempo de inicio
    start_time = time.time()

    # Entrenar ensemble
    ensemble.fit(X_train, y_train)

    # Medir tiempo de fin
    end_time = time.time()
    execution_time = end_time - start_time

    # Predicciones y probabilidades
    y_pred = ensemble.predict(X_test)
    y_proba = ensemble.predict_proba(X_test)[:, 1]

    # Calcular métricas
    auprc = average_precision_score(y_test, y_proba)
    recall = recall_score(y_test, y_pred)
    balanced_acc = balanced_accuracy_score(y_test, y_pred)
    f1 = f1_score(y_test, y_pred)
    mcc = matthews_corrcoef(y_test, y_pred)
    accuracy = accuracy_score(y_test, y_pred)
    precision = precision_score(y_test, y_pred, zero_division=0)

    # Guardar resultados en el orden especificado
    results_tecnica_7.append({
        "Ensemble": ensemble_name,
        "Modelo": base_models,
        "AUPRC": auprc * 100,
        "Recall": recall * 100,
        "Balanced Accuracy": balanced_acc * 100,
        "F1-Score": f1 * 100,
        "MCC": mcc * 100,
        "Accuracy (%)": accuracy * 100,
        "Precision": precision * 100,
        "Execution Time (s)": execution_time,
        "Tecnica": tecnica_name
    })

# Crear un DataFrame con los resultados
results_tecnica_7 = pd.DataFrame(results_tecnica_7)

# Ordenar por AUPRC, luego por Recall, y demás métricas según prioridad
results_tecnica_7 = results_tecnica_7.sort_values(
    by=["AUPRC", "Recall", "Balanced Accuracy"],
    ascending=[False, False, False]
).reset_index(drop=True)

# Mostrar resultados
print("\nResultados Ensembles (Ordenados):")
print(results_tecnica_7)
=== Iniciando evaluación de ensembles ===

=== Entrenando y evaluando ensemble: VotingClassifier ===

=== Entrenando y evaluando ensemble: StackingClassifier ===

Resultados Ensembles (Ordenados):
             Ensemble             Modelo       AUPRC      Recall  \
0    VotingClassifier  XGBoost, CatBoost 82.59615672 80.00000000   
1  StackingClassifier  XGBoost, CatBoost 82.59591955 76.84210526   

   Balanced Accuracy    F1-Score         MCC  Accuracy (%)   Precision  \
0        89.99643398 86.85714286 87.15855237   99.95906016 95.00000000   
1        88.41748662 84.88372093 85.33077774   99.95372019 94.80519481   

   Execution Time (s)    Tecnica  
0         79.35498047  Tecnica 7  
1        394.42615819  Tecnica 7  
CPU times: user 12min 24s, sys: 17.8 s, total: 12min 42s
Wall time: 7min 54s

Tabla Resumen de Resultados¶

En esta sección se consolidan las métricas de todas las técnicas ejecutadas en el proyecto y se ordenan por las métricas "AUPRC" y "Recall", de mayor a menor.

In [ ]:
%%time
# Consolidar resultados finales
final_results = pd.concat([results_tecnica_1, results_tecnica_2,
                           results_tecnica_3, results_tecnica_4,
                           results_tecnica_5, results_tecnica_6,
                           results_tecnica_7],ignore_index=True)

final_results = final_results.sort_values(by=["AUPRC", "Recall"],
                                          ascending=[False, False])

# Mostrar tabla resumen
print("Tabla Resumen de Resultados Finales:")
print(final_results)
Tabla Resumen de Resultados Finales:
                 Modelo       AUPRC      Recall  Balanced Accuracy  \
17              XGBoost 84.77061049 77.25263158        88.62230402   
18             CatBoost 84.27051518 79.37192982        89.68284465   
5               XGBoost 84.23706455 83.15789474        91.55398526   
15              XGBoost 84.02092021 82.55087719        91.25493401   
19    XGBoost, CatBoost 83.75458887 82.10526316        91.03123548   
20    XGBoost, CatBoost 83.65617379 83.15789474        91.55131075   
16             CatBoost 82.94613490 82.54385965        91.23203502   
21    XGBoost, CatBoost 82.59615672 80.00000000        89.99643398   
22    XGBoost, CatBoost 82.59591955 76.84210526        88.41748662   
0              CatBoost 82.47349574 80.00000000        89.99643398   
1               XGBoost 82.38401153 78.94736842        89.46922669   
6              CatBoost 82.28680058 83.15789474        91.53793819   
7         Random Forest 81.82469129 81.05263158        90.51472624   
2         Random Forest 81.36729394 76.84210526        88.41659511   
8              LightGBM 78.66658925 83.15789474        91.51565059   
10        Random Forest 72.11181402 89.47368421        93.54757581   
11             LightGBM 71.97307909 90.52631579        93.69321941   
3   Logistic Regression 71.80130067 62.10526316        81.04460804   
9   Logistic Regression 70.97166654 90.52631579        94.09261319   
12              XGBoost 69.24568170 93.68421053        94.96994693   
13  Logistic Regression 66.84279885 88.42105263        92.54163089   
14             CatBoost 65.41504902 89.47368421        93.68308442   
4              LightGBM 32.41474924 61.05263158        80.44429742   

      F1-Score         MCC  Accuracy (%)   Precision  Execution Time (s)  \
17 84.85060860 85.28456551   99.95371979 94.27725659         18.68230915   
18 86.68457180 87.07311562   99.95905982 95.63294319        302.70576501   
5  78.21782178 78.31754347   99.92168031 73.83177570         11.18287110   
15 79.88193001 79.95588323   99.92968975 77.64867957         44.32414532   
19 79.18781726 79.20150068   99.92702029 76.47058824        704.98505759   
20 77.07317073 77.23928912   99.91634033 71.81818182        140.36777210   
16 71.93298517 72.51650476   99.89097459 63.87698830        519.03088927   
21 86.85714286 87.15855237   99.95906016 95.00000000         79.35498047   
22 84.88372093 85.33077774   99.95372019 94.80519481        394.42615819   
0  86.85714286 87.15855237   99.95906016 95.00000000         75.42357993   
1  85.71428571 86.00968433   99.95550018 93.75000000          4.07381248   
6  71.81818182 72.44324757   99.88964044 63.20000000        132.11596417   
7  83.24324324 83.24613795   99.94482022 85.55555556        817.67223072   
2  84.39306358 84.78080923   99.95194019 93.58974359        511.14025688   
8  64.48979592 66.11059529   99.84514062 52.66666667         11.16479063   
10 11.22853369 22.80616497   97.60768957  5.99013390          2.10636878   
11  8.85684861 20.13556942   96.84941260  4.65619924          0.85839844   
3  72.39263804 73.37008307   99.91990032 86.76470588          1.47671819   
9  11.51271754 23.25183132   97.64684941  6.14724803         38.16073847   
12  7.79334501 19.09650843   96.25133499  4.06578346          0.96711493   
13  8.19112628 19.06992981   96.64827341  4.29447853          0.31764269   
14 12.48164464 24.17700852   97.87824849  6.70876085          7.46699548   
4  47.34693878 48.48028629   99.77038092 38.66666667          6.99200821   

      Tecnica            Ensemble  
17  Tecnica 5                 NaN  
18  Tecnica 5                 NaN  
5   Tecnica 2                 NaN  
15  Tecnica 4                 NaN  
19  Tecnica 6  StackingClassifier  
20  Tecnica 6    VotingClassifier  
16  Tecnica 4                 NaN  
21  Tecnica 7    VotingClassifier  
22  Tecnica 7  StackingClassifier  
0   Tecnica 1                 NaN  
1   Tecnica 1                 NaN  
6   Tecnica 2                 NaN  
7   Tecnica 2                 NaN  
2   Tecnica 1                 NaN  
8   Tecnica 2                 NaN  
10  Tecnica 3                 NaN  
11  Tecnica 3                 NaN  
3   Tecnica 1                 NaN  
9   Tecnica 2                 NaN  
12  Tecnica 3                 NaN  
13  Tecnica 3                 NaN  
14  Tecnica 3                 NaN  
4   Tecnica 1                 NaN  
CPU times: user 9.8 ms, sys: 0 ns, total: 9.8 ms
Wall time: 9.52 ms

Evaluar el impacto del tiempo de procesamiento con SMOTE¶

En esta sección se evalúan los modelos que usaron SMOTE y que han mejorado sus métricas en un 5-10% para decidir su utilización en el modelo final (modelo ganador).

La necesidad de evaluar la efectividad y la eficiencia de las métricas con esta técnica surge porque el tiempo de procesamiento aumenta debido a la generación de datos sintéticos.

Resultados Consolidados de Modelos con y sin SMOTE

| Aspecto | Verificación de SMOTE | Filtrar por AUPRC > 80% |
| Propósito principal | Evaluar si SMOTE mejora rendimiento y eficiencia | Seleccionar el mejor modelo según AUPRC |
| Métrica central | Comparación de métricas clave con/sin SMOTE | Solo AUPRC (mayor a 80%) |
| Criterio adicional | Impacto en el tiempo de ejecución | No aplica |
| Foco | Evaluar técnica (SMOTE) | Evaluar modelo |
| Uso | Decidir si SMOTE es útil | Seleccionar el mejor modelo para el proyecto |
In [ ]:
# ============================================
# Comparar dinámicamente si las métricas mejoran un 5-10 %
# ============================================

# 1. Identificar el baseline (Tecnica 1)
baseline = final_results[final_results["Tecnica"] == "Tecnica 1"].iloc[0]

# 2. Filtrar únicamente las técnicas que usan SMOTE (Tecnica 2, Tecnica 4, Tecnica 6)
smote_techniques = ["Tecnica 2", "Tecnica 4", "Tecnica 6"]
# .copy() evita el SettingWithCopyWarning al añadir columnas nuevas más abajo
final_results_smote = final_results[final_results["Tecnica"].isin(smote_techniques)].copy()

# 3. Comparar las métricas y tiempos usando el baseline
final_results_smote["AUPRC Improvement (%)"] = (
    (final_results_smote["AUPRC"] - baseline["AUPRC"]) / baseline["AUPRC"] * 100
)
final_results_smote["Recall Improvement (%)"] = (
    (final_results_smote["Recall"] - baseline["Recall"]) / baseline["Recall"] * 100
)
final_results_smote["Balanced Accuracy Improvement (%)"] = (
    (final_results_smote["Balanced Accuracy"] - baseline["Balanced Accuracy"]) / baseline["Balanced Accuracy"] * 100
)
final_results_smote["Execution Time Increase (%)"] = (
    (final_results_smote["Execution Time (s)"] - baseline["Execution Time (s)"]) / baseline["Execution Time (s)"] * 100
)

# 4. Marcar mejoras significativas (>= 5%) y tiempos aceptables (<= 50% más)
final_results_smote["Significant AUPRC Improvement"] = final_results_smote["AUPRC Improvement (%)"] >= 5
final_results_smote["Significant Recall Improvement"] = final_results_smote["Recall Improvement (%)"] >= 5
final_results_smote["Significant Balanced Accuracy Improvement"] = final_results_smote["Balanced Accuracy Improvement (%)"] >= 5
final_results_smote["Acceptable Time Increase"] = final_results_smote["Execution Time Increase (%)"] <= 50

# 5. Ordenar resultados por métricas clave
final_results_sorted = final_results_smote.sort_values(
    by=["AUPRC", "Recall", "Balanced Accuracy"],
    ascending=[False, False, False]
)

# Guardar resultados en un archivo CSV
final_results_sorted.to_csv("Resultados_Comparativos_SMOTE.csv", index=False)

# 6. Filtrar filas donde el uso de SMOTE aporta valor al proyecto
# Nota: los paréntesis alrededor de los "|" son necesarios porque "&" tiene
# mayor precedencia que "|" en pandas/numpy
filtered_smote_results = final_results_smote[
    (
        final_results_smote["Significant AUPRC Improvement"] |
        final_results_smote["Significant Recall Improvement"] |
        final_results_smote["Significant Balanced Accuracy Improvement"]
    ) &
    final_results_smote["Acceptable Time Increase"]
]

# 7. Interpretación textual
print("=== Resultados de SMOTE ===")
if filtered_smote_results.empty:
    print("SMOTE no mejoró significativamente el rendimiento ni la eficiencia en los modelos evaluados.")
    print("Se recomienda no utilizar SMOTE para este proyecto.")
else:
    print("SMOTE mejoró el rendimiento en los siguientes casos:")

    # Extraer el mejor modelo con SMOTE basado en AUPRC
    best_smote_model = filtered_smote_results.loc[filtered_smote_results["AUPRC"].idxmax()]

    print(f"- Mejor modelo con SMOTE: {best_smote_model['Modelo']} (Tecnica: {best_smote_model['Tecnica']})")
    print(f"  AUPRC: {best_smote_model['AUPRC']:.2f}")
    print(f"  Recall: {best_smote_model['Recall']:.2f}")
    print(f"  Balanced Accuracy: {best_smote_model['Balanced Accuracy']:.2f}")
    print(f"  Tiempo de ejecución: {best_smote_model['Execution Time (s)']:.2f} segundos")
    print("\nResultados detallados:")
    print(filtered_smote_results)

# Mostrar los resultados filtrados donde SMOTE es beneficioso
#print("\n=== Resultados donde SMOTE es beneficioso ===")
#print(filtered_smote_results)
=== Resultados de SMOTE ===
SMOTE mejoró el rendimiento en los siguientes casos:
- Mejor modelo con SMOTE: Logistic Regression (Tecnica: Tecnica 2)
  AUPRC: 70.97
  Recall: 90.53
  Balanced Accuracy: 94.09
  Tiempo de ejecución: 38.16 segundos

Resultados detallados:
                Modelo       AUPRC      Recall  Balanced Accuracy    F1-Score  \
9  Logistic Regression 70.97166654 90.52631579        94.09261319 11.51271754   

          MCC  Accuracy (%)  Precision  Execution Time (s)    Tecnica  \
9 23.25183132   97.64684941 6.14724803         38.16073847  Tecnica 2   

  Ensemble  AUPRC Improvement (%)  Recall Improvement (%)  \
9      NaN           -13.94609153             13.15789474   

   Balanced Accuracy Improvement (%)  Execution Time Increase (%)  \
9                         4.55149057                 -49.40476373   

   Significant AUPRC Improvement  Significant Recall Improvement  \
9                          False                            True   

   Significant Balanced Accuracy Improvement  Acceptable Time Increase  
9                                      False                      True  

Mejores Modelos con AUPRC > 80%¶

  • En esta sección se presentan solamente los modelos que han obtenido un AUPRC > 80% para facilitar el estudio y la elección del mejor modelo para el proyecto.

  • Los valores están ordenados por "AUPRC" y "Recall", de mayor a menor.

In [ ]:
# Concatenar Ensemble con Modelo si Ensemble está rellenado y no está repetido
final_results["Modelo"] = final_results.apply(
    lambda row: f"{row['Modelo']} ({row['Ensemble']})"
    if pd.notnull(row['Ensemble']) and row['Ensemble'] != '' and row['Ensemble'] not in row['Modelo']
    else row['Modelo'], axis=1
)

# Filtrar solamente los modelos con AUPRC > 80%
final_results_mejor_modelo = final_results[final_results["AUPRC"] > 80.00][
    ["Modelo", "AUPRC", "Recall", "Balanced Accuracy", "F1-Score", "MCC",
     "Accuracy (%)", "Precision", "Execution Time (s)", "Tecnica"]
]

# Ordenar el dataframe final por AUPRC y Recall para elegir los modelos con
# mejor detección de fraudes
final_results_mejor_modelo = final_results_mejor_modelo.sort_values(by=["AUPRC", "Recall"], ascending=[False, False], ignore_index=True)

# Mostrar tabla resumen
print("Mejores resultados con AUPRC > 80% :")
print(final_results_mejor_modelo)

# Exportar resultados a CSV
final_results_mejor_modelo.to_csv("resultados_finales_mejor_modelo.csv", index=False)
print("Resultados finales guardados como 'resultados_finales_mejor_modelo.csv'")
Mejores resultados con AUPRC > 80% :
                                    Modelo       AUPRC      Recall  \
0                                  XGBoost 84.77061049 77.25263158   
1                                 CatBoost 84.27051518 79.37192982   
2                                  XGBoost 84.23706455 83.15789474   
3                                  XGBoost 84.02092021 82.55087719   
4   XGBoost, CatBoost (StackingClassifier) 83.75458887 82.10526316   
5     XGBoost, CatBoost (VotingClassifier) 83.65617379 83.15789474   
6                                 CatBoost 82.94613490 82.54385965   
7     XGBoost, CatBoost (VotingClassifier) 82.59615672 80.00000000   
8   XGBoost, CatBoost (StackingClassifier) 82.59591955 76.84210526   
9                                 CatBoost 82.47349574 80.00000000   
10                                 XGBoost 82.38401153 78.94736842   
11                                CatBoost 82.28680058 83.15789474   
12                           Random Forest 81.82469129 81.05263158   
13                           Random Forest 81.36729394 76.84210526   

    Balanced Accuracy    F1-Score         MCC  Accuracy (%)   Precision  \
0         88.62230402 84.85060860 85.28456551   99.95371979 94.27725659   
1         89.68284465 86.68457180 87.07311562   99.95905982 95.63294319   
2         91.55398526 78.21782178 78.31754347   99.92168031 73.83177570   
3         91.25493401 79.88193001 79.95588323   99.92968975 77.64867957   
4         91.03123548 79.18781726 79.20150068   99.92702029 76.47058824   
5         91.55131075 77.07317073 77.23928912   99.91634033 71.81818182   
6         91.23203502 71.93298517 72.51650476   99.89097459 63.87698830   
7         89.99643398 86.85714286 87.15855237   99.95906016 95.00000000   
8         88.41748662 84.88372093 85.33077774   99.95372019 94.80519481   
9         89.99643398 86.85714286 87.15855237   99.95906016 95.00000000   
10        89.46922669 85.71428571 86.00968433   99.95550018 93.75000000   
11        91.53793819 71.81818182 72.44324757   99.88964044 63.20000000   
12        90.51472624 83.24324324 83.24613795   99.94482022 85.55555556   
13        88.41659511 84.39306358 84.78080923   99.95194019 93.58974359   

    Execution Time (s)    Tecnica  
0          18.68230915  Tecnica 5  
1         302.70576501  Tecnica 5  
2          11.18287110  Tecnica 2  
3          44.32414532  Tecnica 4  
4         704.98505759  Tecnica 6  
5         140.36777210  Tecnica 6  
6         519.03088927  Tecnica 4  
7          79.35498047  Tecnica 7  
8         394.42615819  Tecnica 7  
9          75.42357993  Tecnica 1  
10          4.07381248  Tecnica 1  
11        132.11596417  Tecnica 2  
12        817.67223072  Tecnica 2  
13        511.14025688  Tecnica 1  
Resultados finales guardados como 'resultados_finales_mejor_modelo.csv'

Identificar las Métricas Candidatas al Mejor Modelo¶

Se comparan las métricas obtenidas al aplicar las técnicas estudiadas para identificar qué algoritmos/modelos son candidatos a la mejora de hiperparámetros en la etapa final del proyecto, donde, en la sección "Selección de Algoritmos/Modelos", elegiremos el único modelo con mejores métricas para ponerlo en producción.

Gráfico de Dispersión (AUPRC vs Recall)¶

Este gráfico compara el rendimiento de los modelos en dos métricas clave:

  • AUPRC (%): Área bajo la curva Precision-Recall, que mide la calidad general de la detección.
  • Recall (%): Capacidad del modelo para detectar correctamente las transacciones fraudulentas.

El objetivo es identificar el modelo con mayor Recall y AUPRC, lo que indica un rendimiento superior en la detección de fraudes.
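A modo ilustrativo, un mini ejemplo (con datos de juguete inventados) de cómo se calcula el AUPRC con `average_precision_score` de scikit-learn sobre las probabilidades predichas, la misma métrica que aparece en las tablas de este notebook:

```python
from sklearn.metrics import average_precision_score

# Datos de juguete: 1 fraude entre 5 transacciones (clase muy desbalanceada)
y_true = [0, 0, 0, 0, 1]

# Probabilidades predichas de fraude: el fraude recibe la puntuación más alta
y_scores = [0.10, 0.20, 0.15, 0.30, 0.90]

# AUPRC en %, en la misma escala que las tablas del notebook
auprc = average_precision_score(y_true, y_scores) * 100
print(f"AUPRC: {auprc:.2f} %")  # el fraude queda mejor rankeado -> 100.00 %
```

Un modelo que ordenara mal los fraudes obtendría un AUPRC mucho más bajo, aunque su Accuracy siguiera siendo alta por el desbalanceo de clases.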

In [ ]:
# Gráfico de Dispersión
plt.figure(figsize=(10, 6))

# Crear gráfico
plt.scatter(final_results_mejor_modelo['Recall'], final_results_mejor_modelo['AUPRC'],
            c='orange', s=150, edgecolor='black', alpha=0.7)

# Etiquetas de cada punto con Modelo + Tecnica
texts = []
for i in range(len(final_results_mejor_modelo)):
    model_tecnica = f"{final_results_mejor_modelo['Modelo'][i]} ({final_results_mejor_modelo['Tecnica'][i]})"
    texts.append(plt.text(final_results_mejor_modelo['Recall'][i],
                          final_results_mejor_modelo['AUPRC'][i],
                          model_tecnica, fontsize=9, ha='center', va='center'))

# Ajustar etiquetas automáticamente para evitar solapamientos
adjust_text(texts, arrowprops=dict(arrowstyle="->", color='gray', lw=0.5))

# Configuraciones del gráfico
plt.title("Comparación de Modelos: AUPRC vs Recall", fontsize=14, fontweight='bold')
plt.xlabel("Recall (%) - Detección de Fraudes", fontsize=12)
plt.ylabel("AUPRC (%) - Rendimiento en Precision-Recall", fontsize=12)
plt.grid(True, linestyle='--', alpha=0.5)
plt.tight_layout()

# Mostrar gráfico
plt.show()

# Texto final explicativo
print("\n🔍 **Nota:**")
print("1. Los modelos más deseables tienen valores más altos tanto en Recall como en AUPRC.")
print("2. El modelo que se encuentra más arriba y a la derecha del gráfico es el mejor para detectar fraudes con alto rendimiento.\n")

# Leyenda
print("\n**Leyenda:**")
print("Tecnica 1: Dataset Original")
print("Tecnica 2: Balanceado con SMOTE")
print("Tecnica 3: Balanceado con RandomUnderSampler")
print("Tecnica 4: Validación Cruzada de mejores Modelos (Balanceado con SMOTE)")
print("Tecnica 5: Validación Cruzada de Mejores Modelos (dataset original)")
print("Tecnica 6: Ensembles con SMOTE")
print("Tecnica 7: Ensembles con dataset original (clean_data)\n")
[Imagen: gráfico de dispersión AUPRC vs Recall]
🔍 **Nota:**
1. Los modelos más deseables tienen valores más altos tanto en Recall como en AUPRC.
2. El modelo que se encuentra más arriba y a la derecha del gráfico es el mejor para detectar fraudes con alto rendimiento.


**Leyenda:**
Tecnica 1: Dataset Original
Tecnica 2: Balanceado con SMOTE
Tecnica 3: Balanceado con RandomUnderSampler
Tecnica 4: Validación Cruzada de mejores Modelos (Balanceado con SMOTE)
Tecnica 5: Validación Cruzada de Mejores Modelos (dataset original)
Tecnica 6: Ensembles con SMOTE
Tecnica 7: Ensembles con dataset original (clean_data)


👁️ Observación:

Este gráfico permite identificar el equilibrio entre AUPRC y Recall:

  • XGBoost (Tecnica 5) presenta el mayor AUPRC (84.77%), seguido de cerca por CatBoost (Tecnica 5) con un AUPRC de 84.27% y un Recall algo superior (79.37% frente a 77.25%).
  • XGBoost (Tecnica 2) ofrece el mejor equilibrio, combinando un AUPRC de 84.24% con el Recall más alto del grupo (83.16%).
  • Modelos como CatBoost (Tecnica 4) y XGBoost (Tecnica 4) mantienen un rendimiento competitivo, aunque con AUPRC ligeramente inferior.

📌 Interpretación: Los modelos ubicados en la parte superior derecha del gráfico son los más eficientes en términos de detección de fraudes (Recall) y rendimiento en Precision-Recall (AUPRC).


Comparación de Métricas Clave¶

Este gráfico tiene como objetivo comparar los modelos en términos de sus métricas de rendimiento clave, permitiendo identificar cuáles ofrecen un mejor desempeño general. Muestra el rendimiento de cada modelo en las métricas clave: AUPRC, Recall, F1-Score y MCC.

In [ ]:
# Seleccionar métricas para el gráfico
metricas_clave = ['AUPRC', 'Recall', 'F1-Score', 'MCC']

# Actualizar las etiquetas con Modelo + Tecnica
final_results_mejor_modelo['Modelo_Tecnica'] = final_results_mejor_modelo['Modelo'] + " (" + final_results_mejor_modelo['Tecnica'] + ")"

modelos = final_results_mejor_modelo['Modelo_Tecnica']
valores = final_results_mejor_modelo[metricas_clave]

# Configurar la posición de las barras
x = np.arange(len(modelos))  # Posiciones de los modelos en el eje X
bar_width = 0.2  # Ancho de cada grupo de barras

# Colores de las métricas
colors = ['#1f77b4', '#ff7f0e', '#2ca02c', '#d62728']  # AUPRC, Recall, F1-Score, MCC

# Crear la figura
plt.figure(figsize=(14, 8))

# Dibujar barras individuales para cada métrica
for i, metric in enumerate(metricas_clave):
    plt.bar(x + i * bar_width, final_results_mejor_modelo[metric], width=bar_width, color=colors[i],
            label=metric, edgecolor='black')

# Etiquetas de los modelos en el eje X
plt.xticks(x + (bar_width * (len(metricas_clave) - 1)) / 2, modelos, rotation=90, ha='center')

# Configuraciones del gráfico
plt.title("Comparación de Métricas Clave por Modelo (Barras Comparativas)", fontsize=14, fontweight='bold')
plt.ylabel("Valor (%)", fontsize=12)
plt.xlabel("Modelos", fontsize=12)
plt.legend(title="Métricas Clave", bbox_to_anchor=(1.05, 1), loc='upper left')
plt.grid(axis='y', linestyle='--', alpha=0.5)
plt.tight_layout()

# Mostrar el gráfico
plt.show()

# Texto final explicativo
print("\n🔍 **Nota:**")
print("1. Cada modelo tiene 4 barras adyacentes que representan sus métricas clave (AUPRC, Recall, F1-Score y MCC).")
print("2. Permite comparar directamente el rendimiento de cada métrica entre los modelos.")
print("3. Un modelo ideal tendrá valores más altos en todas las métricas clave.\n")

# Leyenda
print("\n**Leyenda:**")
print("Tecnica 1: Dataset Original")
print("Tecnica 2: Balanceado con SMOTE")
print("Tecnica 3: Balanceado con RandomUnderSampler")
print("Tecnica 4: Validación Cruzada de mejores Modelos (Balanceado con SMOTE)")
print("Tecnica 5: Validación Cruzada de Mejores Modelos (dataset original)")
print("Tecnica 6: Ensembles con SMOTE")
print("Tecnica 7: Ensembles con dataset original (clean_data)\n")
[Imagen: gráfico de barras comparativas de métricas clave por modelo]
🔍 **Nota:**
1. Cada modelo tiene 4 barras adyacentes que representan sus métricas clave (AUPRC, Recall, F1-Score y MCC).
2. Permite comparar directamente el rendimiento de cada métrica entre los modelos.
3. Un modelo ideal tendrá valores más altos en todas las métricas clave.


**Leyenda:**
Tecnica 1: Dataset Original
Tecnica 2: Balanceado con SMOTE
Tecnica 3: Balanceado con RandomUnderSampler
Tecnica 4: Validación Cruzada de mejores Modelos (Balanceado con SMOTE)
Tecnica 5: Validación Cruzada de Mejores Modelos (dataset original)
Tecnica 6: Ensembles con SMOTE
Tecnica 7: Ensembles con dataset original (clean_data)


👁️ Observación:

En este gráfico se comparan las métricas clave de AUPRC, Recall, F1-Score y MCC para cada modelo.

  • CatBoost (Tecnica 5) y XGBoost (Tecnica 5) destacan con los valores más altos en AUPRC, F1-Score y MCC, lo que refleja una combinación sólida entre precisión y capacidad de detección de fraudes, aunque con un Recall algo inferior al de otras técnicas.
  • CatBoost (Tecnica 4) también sobresale por su rendimiento balanceado en todas las métricas clave, mostrando una alternativa competitiva.

📌 Interpretación: La evaluación de estas métricas permite una comparación directa y detallada entre los modelos evaluados.


In [ ]:
# Seleccionar métricas para el gráfico
metricas_clave = ['AUPRC', 'Recall', 'F1-Score', 'MCC']

# Actualizar las etiquetas con Modelo + Tecnica
final_results_mejor_modelo['Modelo_Tecnica'] = final_results_mejor_modelo['Modelo'] + " (" + final_results_mejor_modelo['Tecnica'] + ")"

modelos = final_results_mejor_modelo['Modelo_Tecnica']
valores = final_results_mejor_modelo[metricas_clave]

# Gráfico de barras apiladas
plt.figure(figsize=(14, 8))  # Aumentar tamaño de la figura
bar_width = 0.8  # Grosor más amplio de las barras

# Generar barras apiladas
bottom = np.zeros(len(modelos))  # Inicializar acumulador
colors = ['#1f77b4', '#ff7f0e', '#2ca02c', '#d62728']  # Colores de las métricas

for i, metric in enumerate(metricas_clave):
    bars = plt.bar(modelos, final_results_mejor_modelo[metric], bottom=bottom,
                   color=colors[i],
                   label=metric, edgecolor='black', width=bar_width)

    # Añadir los valores dentro de las barras
    for bar, value in zip(bars, final_results_mejor_modelo[metric]):
        plt.text(bar.get_x() + bar.get_width() / 2,  # Posición X
                 bar.get_y() + bar.get_height() / 2,  # Posición Y
                 f"{value:.1f}",  # Formato del valor
                 ha='center', va='center', fontsize=10,
                 color='white', fontweight='bold')  # Texto en blanco y negrita

    bottom += final_results_mejor_modelo[metric]

# Configuraciones del gráfico
plt.title("Comparación de Métricas Clave por Modelo (Barras Apiladas)",
          fontsize=14, fontweight='bold')
plt.ylabel("Valor (%)", fontsize=12)
plt.xticks(rotation=90, ha='center')  # Etiquetas del eje X perpendiculares
plt.legend(bbox_to_anchor=(1.05, 1), loc='upper left')  # Leyenda fuera del área del gráfico
plt.grid(axis='y', linestyle='--', alpha=0.5)
plt.tight_layout()

# Mostrar gráfico
plt.show()

# Texto final explicativo
print("\n🔍 **Nota:**")
print("1. Las barras más altas representan modelos con mejor rendimiento \
general en todas las métricas clave.")

print("2. Compara modelos según su desempeño en AUPRC, Recall, F1-Score y MCC.")

print("3. El modelo con mayor altura general y equilibrio entre métricas es \
probablemente la mejor elección.\n")

# Leyenda
print("\n**Leyenda:**")
print("Tecnica 1: Dataset Original")
print("Tecnica 2: Balanceado con SMOTE")
print("Tecnica 3: Balanceado con RandomUnderSampler")
print("Tecnica 4: Validación Cruzada de mejores Modelos (Balanceado con SMOTE)")
print("Tecnica 5: Validación Cruzada de Mejores Modelos (dataset original)")
print("Tecnica 6: Ensembles con SMOTE")
print("Tecnica 7: Ensembles con dataset original (clean_data)\n")
[Imagen: gráfico de barras apiladas de métricas clave por modelo]
🔍 **Nota:**
1. Las barras más altas representan modelos con mejor rendimiento general en todas las métricas clave.
2. Compara modelos según su desempeño en AUPRC, Recall, F1-Score y MCC.
3. El modelo con mayor altura general y equilibrio entre métricas es probablemente la mejor elección.


**Leyenda:**
Tecnica 1: Dataset Original
Tecnica 2: Balanceado con SMOTE
Tecnica 3: Balanceado con RandomUnderSampler
Tecnica 4: Validación Cruzada de mejores Modelos (Balanceado con SMOTE)
Tecnica 5: Validación Cruzada de Mejores Modelos (dataset original)
Tecnica 6: Ensembles con SMOTE
Tecnica 7: Ensembles con dataset original (clean_data)

Tabla Resumen de la Tasa de Falsos Negativos (FNR)¶

Esta tabla muestra el FNR (%) y Recall (%) de todos los modelos comparados. Es útil para observar valores precisos.
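La identidad FNR (%) = 100 − Recall (%) que usa la tabla siguiente puede verificarse directamente desde la matriz de confusión. Un boceto mínimo con etiquetas inventadas:

```python
from sklearn.metrics import confusion_matrix, recall_score

# Etiquetas de juguete (1 = fraude): 3 fraudes reales, 1 sin detectar
y_true = [1, 1, 1, 0, 0, 0]
y_pred = [1, 1, 0, 0, 0, 0]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

fnr = fn / (fn + tp) * 100              # tasa de falsos negativos
recall = recall_score(y_true, y_pred) * 100

print(f"FNR: {fnr:.2f} %  |  Recall: {recall:.2f} %")  # 33.33 % y 66.67 %: suman 100 %
```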

In [ ]:
# Calcular FNR y crear tabla resumen
fnr_recall_df = pd.DataFrame({
    "Modelo": final_results_mejor_modelo['Modelo'],
    "Recall (%)": final_results_mejor_modelo['Recall'],
    "FNR (%)": 100 - final_results_mejor_modelo['Recall'],  # FNR = 100% - Recall (%)
    "Tecnica": final_results_mejor_modelo['Tecnica']
})

# Mostrar tabla ordenada por FNR ascendente
fnr_recall_df = fnr_recall_df.sort_values(by="FNR (%)", ascending=True, ignore_index=True)
print("📋 **Resumen de la Tasa de Falsos Negativos (FNR) y Recall**")
display(fnr_recall_df)
📋 **Resumen de la Tasa de Falsos Negativos (FNR) y Recall**
Modelo Recall (%) FNR (%) Tecnica
0 XGBoost 83.15789474 16.84210526 Tecnica 2
1 XGBoost, CatBoost (VotingClassifier) 83.15789474 16.84210526 Tecnica 6
2 CatBoost 83.15789474 16.84210526 Tecnica 2
3 XGBoost 82.55087719 17.44912281 Tecnica 4
4 CatBoost 82.54385965 17.45614035 Tecnica 4
5 XGBoost, CatBoost (StackingClassifier) 82.10526316 17.89473684 Tecnica 6
6 Random Forest 81.05263158 18.94736842 Tecnica 2
7 XGBoost, CatBoost (VotingClassifier) 80.00000000 20.00000000 Tecnica 7
8 CatBoost 80.00000000 20.00000000 Tecnica 1
9 CatBoost 79.37192982 20.62807018 Tecnica 5
10 XGBoost 78.94736842 21.05263158 Tecnica 1
11 XGBoost 77.25263158 22.74736842 Tecnica 5
12 XGBoost, CatBoost (StackingClassifier) 76.84210526 23.15789474 Tecnica 7
13 Random Forest 76.84210526 23.15789474 Tecnica 1
Comparación de la Tasa de Falsos Negativos (FNR)¶

El objetivo de este gráfico es comparar la Tasa de Falsos Negativos (FNR) entre los diferentes modelos evaluados. La FNR mide la proporción de transacciones fraudulentas que no fueron detectadas como tales, siendo un valor crítico en proyectos de detección de fraudes.

Este gráfico te ayuda a identificar los modelos con menor FNR, lo cual es crucial para problemas como la detección de fraudes.

In [ ]:
# Ordenar el dataframe por FNR en orden ascendente
fnr_recall_df_sorted = fnr_recall_df.sort_values(by="FNR (%)", ascending=True)

# Crear una nueva columna con nombres abreviados (Modelo 1, Modelo 2, etc.)
fnr_recall_df_sorted["Codigo"] = [f"Modelo {i+1}" for i in range(len(fnr_recall_df_sorted))]

# Crear un diccionario para la leyenda que mapea código -> nombre completo
codigo_to_nombre = dict(zip(fnr_recall_df_sorted["Codigo"], fnr_recall_df_sorted["Modelo"]))

# Gráfico de barras horizontales con nombres abreviados
plt.figure(figsize=(12, 7))
bars = plt.barh(fnr_recall_df_sorted["Codigo"], fnr_recall_df_sorted["FNR (%)"],
                color='orange', edgecolor='black', height=0.6)

# Mostrar el valor exacto de cada barra
for bar in bars:
    width = bar.get_width()
    plt.text(width - 0.5, bar.get_y() + bar.get_height()/2, f"{width:.2f}",
             color='white', fontsize=10, ha='right', va='center', fontweight='bold')

# Configuraciones del gráfico
plt.xlabel("Tasa de Falsos Negativos (FNR) (%)", fontsize=12)
plt.ylabel("Modelos", fontsize=12)
plt.title("Comparación de la Tasa de Falsos Negativos (FNR) por Modelo", fontsize=14, fontweight='bold')
plt.xlim(0, fnr_recall_df_sorted["FNR (%)"].max() + 5)  # Ajustar el eje X para visibilidad
plt.gca().invert_yaxis()  # Invertir eje Y para mostrar el menor FNR arriba
plt.grid(axis='x', linestyle='--', alpha=0.5)

# Agregar una leyenda con nombres completos
handles = [plt.Rectangle((0, 0), 1, 1, color='orange', edgecolor='black') for _ in codigo_to_nombre]
plt.legend(handles, [f"{codigo}: {nombre}" for codigo, nombre in codigo_to_nombre.items()],
           title="Leyenda de Modelos", bbox_to_anchor=(1.05, 1), loc='upper left')

# Mostrar gráfico
plt.tight_layout()
plt.show()

# Texto final explicativo
print("\n🔍 **Nota:**")
print("1. Las barras están ordenadas de menor a mayor FNR para facilitar la comparación.")
print("2. Se utiliza una leyenda con nombres completos para evitar sobrecargar el eje Y.")
print("3. Los modelos en la parte superior tienen menor FNR, lo que indica un mejor rendimiento al reducir falsos negativos.")
[Imagen: gráfico de barras horizontales de FNR por modelo]
🔍 **Nota:**
1. Las barras están ordenadas de menor a mayor FNR para facilitar la comparación.
2. Se utiliza una leyenda con nombres completos para evitar sobrecargar el eje Y.
3. Los modelos en la parte superior tienen menor FNR, lo que indica un mejor rendimiento al reducir falsos negativos.

👁️ Observación:

La Tasa de Falsos Negativos (FNR) evalúa la proporción de fraudes no detectados:

  • XGBoost (Tecnica 2), CatBoost (Tecnica 2) y el ensemble XGBoost, CatBoost (VotingClassifier) de la Tecnica 6 logran la menor tasa de falsos negativos (16.84%).
  • Los FNR más altos corresponden a XGBoost, CatBoost (StackingClassifier) de la Tecnica 7 y a Random Forest (Tecnica 1), con un 23.16%, lo que refleja debilidades en la detección de fraudes.

📌 Importancia: Un bajo FNR es crucial en aplicaciones de fraude, donde minimizar falsos negativos evita pérdidas significativas.


FNR vs Recall¶

Este gráfico te ayuda a visualizar el trade-off entre Recall y FNR, mostrando qué tan bien cada modelo equilibra estas métricas.

In [ ]:
# Texto inicial explicativo
print("📊 **Gráfico de Dispersión: FNR vs Recall**")
print("Este gráfico compara la Tasa de Falsos Negativos (FNR) y el Recall para cada modelo, mostrando su rendimiento en la detección de fraudes.")

# Gráfico de dispersión
plt.figure(figsize=(10, 6))
plt.scatter(fnr_recall_df["FNR (%)"], fnr_recall_df["Recall (%)"], color='red', s=100, edgecolor='black')

# Etiquetas de cada punto con Modelo + Tecnica
texts = []
for i in range(len(fnr_recall_df)):
    model_tecnica = f"{fnr_recall_df['Modelo'].iloc[i]} ({fnr_recall_df['Tecnica'].iloc[i]})"
    texts.append(plt.text(fnr_recall_df["FNR (%)"].iloc[i],
                          fnr_recall_df["Recall (%)"].iloc[i],
                          model_tecnica, fontsize=9, ha='center', va='center'))

# Ajustar etiquetas automáticamente
adjust_text(texts, arrowprops=dict(arrowstyle="->", color='gray', lw=0.5))

# Configuraciones del gráfico
plt.title("Tasa de Falsos Negativos (FNR) vs Recall (%)", fontsize=14, fontweight='bold')
plt.xlabel("FNR (%) - Tasa de Falsos Negativos", fontsize=12)
plt.ylabel("Recall (%) - Detección de Fraudes", fontsize=12)
plt.grid(True, linestyle='--', alpha=0.5)
plt.tight_layout()

# Mostrar gráfico
plt.show()

# Texto final explicativo
print("Nota: Este gráfico permite identificar modelos que logran un equilibrio óptimo entre reducir falsos negativos y aumentar la detección de fraudes.")
📊 **Gráfico de Dispersión: FNR vs Recall**
Este gráfico compara la Tasa de Falsos Negativos (FNR) y el Recall para cada modelo, mostrando su rendimiento en la detección de fraudes.
[Imagen: gráfico de dispersión FNR vs Recall]
Nota: Este gráfico permite identificar modelos que logran un equilibrio óptimo entre reducir falsos negativos y aumentar la detección de fraudes.

👁️ Observación:

El gráfico muestra la relación inversa entre Recall y FNR (por definición, FNR = 100% − Recall):

  • Modelos con bajo FNR (izquierda) y alto Recall (arriba) son los mejores.

📌 Interpretación: Este gráfico confirma que un alto Recall se corresponde exactamente con una menor tasa de falsos negativos.


Resultados y Análisis¶

Tras analizar las métricas clave y las visualizaciones generadas, se concluye lo siguiente:

CatBoost y XGBoost han obtenido las mejores métricas en todas las técnicas que hemos probado.

Selección de Algoritmos/Modelos para ponerlo en producción¶

Llegando a este punto, ya tenemos claro qué algoritmos/modelos funcionan mejor en el proceso de clasificación de transacciones fraudulentas.

El algoritmo/modelo que obtenga las mejores métricas (AUPRC, Recall, Precision y F1-Score) se volverá a entrenar con los mejores hiperparámetros y generaremos el archivo .pkl para ponerlo en producción.

Para mejorar las métricas usaremos el método de balanceo de clases SMOTE y lo aplicaremos solamente al conjunto de datos de entrenamiento; para probar los modelos usaremos el conjunto de datos original. Esto se aplica en:

  • Validación Cruzada con cross_val_score (con SMOTE)
  • Optimización de Hiperparámetros con GridSearchCV (con SMOTE)
  • Optimización de Hiperparámetros con Optuna (con SMOTE)

El dataset original está limpio y se llama clean_data. Para más información, ver las secciones "Limpieza de Datos" y "Separar el dataset".
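Un boceto mínimo del principio de aplicar el balanceo solo al entrenamiento (con datos sintéticos y sobremuestreo aleatorio simple vía numpy para mantener el ejemplo autocontenido; en el proyecto ese papel lo cumple SMOTE de imbalanced-learn, idealmente dentro de un Pipeline de imblearn para que el remuestreo ocurra dentro de cada fold de la validación cruzada):

```python
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Datos sintéticos muy desbalanceados (990 legítimas, 10 fraudes)
X = rng.normal(size=(1000, 5))
y = np.array([0] * 990 + [1] * 10)

# 1. Separar ANTES de balancear: el test conserva la distribución real
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

# 2. Sobremuestrear la clase minoritaria SOLO en el conjunto de entrenamiento
idx_min = np.where(y_train == 1)[0]
idx_extra = rng.choice(idx_min, size=(y_train == 0).sum() - len(idx_min), replace=True)
X_train_bal = np.vstack([X_train, X_train[idx_extra]])
y_train_bal = np.concatenate([y_train, y_train[idx_extra]])

# El entrenamiento queda balanceado; el test sigue siendo desbalanceado
print("Train balanceado:", np.bincount(y_train_bal))
print("Test original:   ", np.bincount(y_test))
```

La misma idea se mantiene al usar cross_val_score, GridSearchCV u Optuna: el remuestreo debe quedar dentro del pipeline para que solo vea los datos de entrenamiento de cada fold y las métricas de prueba no queden infladas.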

Función para Entrenamiento, Evaluación y Detección de Sobreajuste (Overfitting)¶

Esta función entrena un modelo de Machine Learning, calcula las métricas de rendimiento tanto en el conjunto de entrenamiento como en el de prueba, y compara dichas métricas para detectar posibles problemas de overfitting.

In [ ]:
from sklearn.metrics import (
    precision_score, recall_score, accuracy_score, f1_score, average_precision_score
)
from inspect import signature
import pandas as pd  # required for pd.concat below

def entrenar_y_evaluar(modelo, nombre_modelo, parametros,
                       X_train, y_train, X_test=None, y_test=None,
                       tecnica="", umbral_sobreajuste=10, resultados_df=None):
    """
    Trains a model, computes its metrics, flags overfitting and appends the results to resultados_df.
    """
    if resultados_df is None:
        raise ValueError("El DataFrame resultados_df debe ser proporcionado como argumento.")

    # ========================
    # Model training
    # ========================
    print(f"\n🚀 Entrenando {nombre_modelo} ({tecnica})...")
    model = modelo(**parametros)
    model.fit(X_train, y_train)

    # Fall back to the training data when no test set is provided
    if X_test is None or y_test is None:
        X_test, y_test = X_train, y_train
        print("⚠️ Usando datos de entrenamiento como prueba (posible sesgo en métricas).")

    # ========================
    # Predictions
    # ========================
    y_pred_train = model.predict(X_train)
    y_pred_proba_train = model.predict_proba(X_train)[:, 1]
    y_pred_test = model.predict(X_test)
    y_pred_proba_test = model.predict_proba(X_test)[:, 1]

    # ========================
    # Metric computation
    # ========================
    metrics_train = {
        'Precision': precision_score(y_train, y_pred_train) * 100,
        'Recall': recall_score(y_train, y_pred_train) * 100,
        'AUPRC': average_precision_score(y_train, y_pred_proba_train) * 100,
        'Accuracy': accuracy_score(y_train, y_pred_train) * 100,
        'F1-Score': f1_score(y_train, y_pred_train) * 100
    }

    metrics_test = {
        'Precision': precision_score(y_test, y_pred_test) * 100,
        'Recall': recall_score(y_test, y_pred_test) * 100,
        'AUPRC': average_precision_score(y_test, y_pred_proba_test) * 100,
        'Accuracy': accuracy_score(y_test, y_pred_test) * 100,
        'F1-Score': f1_score(y_test, y_pred_test) * 100
    }

    # ========================
    # Overfitting detection: any train/test gap above the threshold
    # ========================
    sobreajuste = any(
        abs(metrics_train[metric] - metrics_test[metric]) > umbral_sobreajuste
        for metric in metrics_train
    )

    # ========================
    # Build the results record
    # ========================
    resultado = {
        'Modelo': nombre_modelo,
        'Tecnica': tecnica,
        'Sobreajuste': int(sobreajuste),
        **{f'{k}_Train': v for k, v in metrics_train.items()},
        **{f'{k}_Test': v for k, v in metrics_test.items()}
    }

    # Include the model's constructor parameters dynamically
    parametros_validos = set(signature(modelo).parameters.keys())
    for key in parametros_validos:
        resultado[key] = parametros.get(key, None)

    # ========================
    # Update the DataFrame
    # ========================
    resultados_df = pd.concat([resultados_df, pd.DataFrame([resultado])], ignore_index=True)

    # ========================
    # Print results
    # ========================
    print(f"\n✅ Resultados para {nombre_modelo} ({tecnica}):")
    for k, v in resultado.items():
        print(f" - {k}: {v:.2f}" if isinstance(v, float) else f" - {k}: {v}")
    print(f"✅ Tamaño del DataFrame actualizado: {resultados_df.shape}")

    return resultados_df
In [ ]:
from sklearn.metrics import (
    precision_score, recall_score, accuracy_score, f1_score, average_precision_score
)
import pandas as pd  # required for pd.concat below

def entrenar_y_evaluar(modelo, nombre_modelo, parametros,
                       X_train, y_train, X_test=None, y_test=None,
                       tecnica="", umbral_sobreajuste=5, resultados_df=None):
    """
    Trains a model, computes its metrics, flags overfitting and appends the results to resultados_df.
    """
    if resultados_df is None:
        raise ValueError("El DataFrame resultados_df debe ser proporcionado como argumento.")

    # ========================
    # Model training
    # ========================
    print(f"\n🚀 Entrenando {nombre_modelo} ({tecnica})...")
    model = modelo(**parametros)
    model.fit(X_train, y_train)

    # Fall back to the training data when no test set is provided
    if X_test is None or y_test is None:
        X_test, y_test = X_train, y_train
        print("⚠️ Usando datos de entrenamiento como prueba (posible sesgo en métricas).")

    # ========================
    # Predictions
    # ========================
    y_pred_train = model.predict(X_train)
    y_pred_proba_train = model.predict_proba(X_train)[:, 1]
    y_pred_test = model.predict(X_test)
    y_pred_proba_test = model.predict_proba(X_test)[:, 1]

    # ========================
    # Metric computation
    # ========================
    metrics_train = {
        'Precision': precision_score(y_train, y_pred_train) * 100,
        'Recall': recall_score(y_train, y_pred_train) * 100,
        'AUPRC': average_precision_score(y_train, y_pred_proba_train) * 100,
        'Accuracy': accuracy_score(y_train, y_pred_train) * 100,
        'F1-Score': f1_score(y_train, y_pred_train) * 100
    }

    metrics_test = {
        'Precision': precision_score(y_test, y_pred_test) * 100,
        'Recall': recall_score(y_test, y_pred_test) * 100,
        'AUPRC': average_precision_score(y_test, y_pred_proba_test) * 100,
        'Accuracy': accuracy_score(y_test, y_pred_test) * 100,
        'F1-Score': f1_score(y_test, y_pred_test) * 100
    }

    # ========================
    # Overfitting detection
    # ========================
    sobreajuste = 0  # 0 = no overfitting detected
    print(f"\n\n--- Comparación de Métricas: {nombre_modelo} ({tecnica}) ---\n")

    # A perfect test score is a strong overfitting signal on imbalanced data
    if metrics_test['Recall'] == 100.0 or metrics_test['AUPRC'] == 100.0:
        sobreajuste = 1
    # Otherwise, flag any train/test gap above the threshold
    elif any(abs(metrics_train[metric] - metrics_test[metric]) > umbral_sobreajuste for metric in metrics_train):
        sobreajuste = 1

    # Show the train/test comparison and the per-metric gaps
    for metric in metrics_train:
        diff = abs(metrics_train[metric] - metrics_test[metric])
        print(f"{metric} - Entrenamiento: {metrics_train[metric]:.2f}%, Prueba: {metrics_test[metric]:.2f}%, Diferencia: {diff:.2f}%")
        if diff > umbral_sobreajuste:
            print(f"⚠️ Overfitting detectado en {metric} \n")

    # ========================
    # Build the results record
    # ========================
    resultado = {
        'Modelo': nombre_modelo,
        'Tecnica': tecnica,
        'Sobreajuste': sobreajuste,
        **{f'{k}_Train': v for k, v in metrics_train.items()},
        **{f'{k}_Test': v for k, v in metrics_test.items()},
    }

    # Explicitly include the hyperparameters of interest
    for key in ['iterations', 'learning_rate', 'depth', 'class_weights', 'verbose',
                'max_depth', 'n_estimators', 'scale_pos_weight', 'min_child_weight',
                'gamma', 'l2_leaf_reg', 'subsample']:
        resultado[key] = parametros.get(key, None)

    # ========================
    # Update the DataFrame
    # ========================
    resultados_df = pd.concat([resultados_df, pd.DataFrame([resultado])], ignore_index=True)

    # ========================
    # Print results
    # ========================
    print(f"\n✅ Resultados para {nombre_modelo} ({tecnica}):")
    for k, v in resultado.items():
        print(f" - {k}: {v}")
    print(f"✅ Tamaño del DataFrame actualizado: {resultados_df.shape}")

    return resultados_df

1. Cross-Validation with StratifiedKFold (ADASYN)¶

ADASYN is a variant of SMOTE that oversamples by generating synthetic examples of the minority class.

Unlike SMOTE, ADASYN prioritizes generating more synthetic examples in the regions where the model struggles most (i.e., where the data is under-represented or harder to classify).

It uses an adaptive density: the number of samples generated adjusts to how hard each minority point is to classify, producing more samples in the difficult areas.

In [ ]:
%%time
# ==========================================================
# 1. Cross-Validation with ADASYN
# (%%time must be the first line of the cell for the magic to work)
# ==========================================================
from imblearn.over_sampling import ADASYN
from sklearn.model_selection import StratifiedKFold
from catboost import CatBoostClassifier
from xgboost import XGBClassifier
import pandas as pd

# Original training data
X = X_train  # original training features
y = y_train  # training labels

# Hyperparameter configurations
parametros_catboost = {
    'iterations': 300,
    'learning_rate': 0.03,
    'depth': 4,
    'l2_leaf_reg': 10.0,
    'class_weights': [1, 10],
    'verbose': 50
}

parametros_xgb = {
    'learning_rate': 0.03,
    'max_depth': 4,
    'n_estimators': 300,
    'min_child_weight': 2,
    'scale_pos_weight': 10,
    'subsample': 0.8,
    'gamma': 0.2
}

# Initialize a DataFrame specific to the ADASYN results
columnas_resultados = [
    'Modelo', 'Tecnica', 'Sobreajuste',
    'Precision_Train', 'Recall_Train', 'AUPRC_Train', 'Accuracy_Train', 'F1-Score_Train',
    'Precision_Test', 'Recall_Test', 'AUPRC_Test', 'Accuracy_Test', 'F1-Score_Test',
    'iterations', 'learning_rate', 'depth', 'class_weights', 'verbose',
    'max_depth', 'n_estimators', 'scale_pos_weight', 'min_child_weight',
    'gamma', 'l2_leaf_reg', 'subsample', 'Fold'
]
resultados_adasyn = pd.DataFrame(columns=columnas_resultados)

# Cross-validation strategy
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)

# ================================================
# Evaluation with CatBoost using ADASYN
# ================================================
print("\n🚀 Evaluación con CatBoost usando ADASYN en cada fold...")
for fold, (train_idx, test_idx) in enumerate(kfold.split(X, y), 1):
    print(f"\n🔄 Fold {fold}:")
    # Split the fold into training and test parts
    X_train_fold, X_test_fold = X.iloc[train_idx], X.iloc[test_idx]
    y_train_fold, y_test_fold = y.iloc[train_idx], y.iloc[test_idx]

    # Apply ADASYN to the training part of the fold only
    print("📊 Aplicando ADASYN en el conjunto de entrenamiento...")
    adasyn = ADASYN(sampling_strategy='minority', random_state=42, n_neighbors=5)
    X_train_res, y_train_res = adasyn.fit_resample(X_train_fold, y_train_fold)

    # Train and evaluate the model
    resultados_adasyn = entrenar_y_evaluar(
        modelo=CatBoostClassifier,
        nombre_modelo="CatBoost",
        parametros=parametros_catboost,
        X_train=X_train_res,
        y_train=y_train_res,
        X_test=X_test_fold,
        y_test=y_test_fold,
        tecnica="Validación Cruzada con ADASYN",
        resultados_df=resultados_adasyn
    )
    # Record the fold number on the newly appended row only
    # (assigning to the whole column would overwrite every previous fold)
    resultados_adasyn.loc[resultados_adasyn.index[-1], 'Fold'] = fold

# ================================================
# Evaluation with XGBoost using ADASYN
# ================================================
print("\n🚀 Evaluación con XGBoost usando ADASYN en cada fold...")
for fold, (train_idx, test_idx) in enumerate(kfold.split(X, y), 1):
    print(f"\n🔄 Fold {fold}:")
    # Split the fold into training and test parts
    X_train_fold, X_test_fold = X.iloc[train_idx], X.iloc[test_idx]
    y_train_fold, y_test_fold = y.iloc[train_idx], y.iloc[test_idx]

    # Apply ADASYN to the training part of the fold only
    print("📊 Aplicando ADASYN en el conjunto de entrenamiento...")
    adasyn = ADASYN(sampling_strategy='minority', random_state=42, n_neighbors=5)
    X_train_res, y_train_res = adasyn.fit_resample(X_train_fold, y_train_fold)

    # Train and evaluate the model
    resultados_adasyn = entrenar_y_evaluar(
        modelo=XGBClassifier,
        nombre_modelo="XGBoost",
        parametros=parametros_xgb,
        X_train=X_train_res,
        y_train=y_train_res,
        X_test=X_test_fold,
        y_test=y_test_fold,
        tecnica="Validación Cruzada con ADASYN",
        resultados_df=resultados_adasyn
    )
    # Record the fold number on the newly appended row only
    resultados_adasyn.loc[resultados_adasyn.index[-1], 'Fold'] = fold

# ================================================
# Save Results
# ================================================
print("\n🏆 Resultados Finales Ordenados:")
resultados_ordenados = resultados_adasyn.sort_values(
    by=['AUPRC_Test', 'Recall_Test', 'Precision_Test', 'F1-Score_Test'],
    ascending=False
)
print(resultados_ordenados)

# Save results to CSV
resultados_adasyn.to_csv("resultados_adasyn.csv", index=False)
print("\n✅ Resultados guardados en 'resultados_adasyn.csv'")
🚀 Evaluación con CatBoost usando ADASYN en cada fold...

🔄 Fold 1:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando CatBoost (Validación Cruzada con ADASYN)...
0:	learn: 0.6340803	total: 177ms	remaining: 53s
50:	learn: 0.1012081	total: 7.27s	remaining: 35.5s
100:	learn: 0.0611959	total: 12s	remaining: 23.5s
150:	learn: 0.0466757	total: 16.8s	remaining: 16.5s
200:	learn: 0.0365530	total: 25.5s	remaining: 12.6s
250:	learn: 0.0298269	total: 31.1s	remaining: 6.06s
299:	learn: 0.0250395	total: 40.3s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con ADASYN):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 94.33
 - Recall_Train: 100.00
 - AUPRC_Train: 99.74
 - Accuracy_Train: 97.00
 - F1-Score_Train: 97.08
 - Precision_Test: 2.64
 - Recall_Test: 94.74
 - AUPRC_Test: 70.43
 - Accuracy_Test: 94.09
 - F1-Score_Test: 5.14
 - depth: 4
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 10.00
 - class_weights: [1, 10]
 - verbose: 50
 - (all other CatBoostClassifier constructor parameters: None)
✅ Tamaño del DataFrame actualizado: (1, 135)
✅ Tamaño del DataFrame actualizado: (1, 135)

🔄 Fold 2:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando CatBoost (Validación Cruzada con ADASYN)...
0:	learn: 0.6313038	total: 111ms	remaining: 33.1s
50:	learn: 0.1008489	total: 6.37s	remaining: 31.1s
100:	learn: 0.0604422	total: 12s	remaining: 23.6s
150:	learn: 0.0438902	total: 16.7s	remaining: 16.5s
200:	learn: 0.0341644	total: 23.9s	remaining: 11.8s
250:	learn: 0.0279447	total: 28.5s	remaining: 5.57s
299:	learn: 0.0229186	total: 33.2s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con ADASYN):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 94.71
 - Recall_Train: 100.00
 - AUPRC_Train: 99.81
 - Accuracy_Train: 97.21
 - F1-Score_Train: 97.28
 - Precision_Test: 2.55
 - Recall_Test: 92.11
 - AUPRC_Test: 60.14
 - Accuracy_Test: 94.05
 - F1-Score_Test: 4.97
✅ Tamaño del DataFrame actualizado: (2, 135)

🔄 Fold 3:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando CatBoost (Validación Cruzada con ADASYN)...
0:	learn: 0.6276704	total: 178ms	remaining: 53.3s
50:	learn: 0.0826680	total: 5.73s	remaining: 28s
100:	learn: 0.0510718	total: 10.4s	remaining: 20.4s
150:	learn: 0.0381166	total: 17.5s	remaining: 17.3s
200:	learn: 0.0308758	total: 22.2s	remaining: 10.9s
250:	learn: 0.0254456	total: 26.8s	remaining: 5.24s
299:	learn: 0.0213878	total: 34s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con ADASYN):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 94.94
 - Recall_Train: 100.00
 - AUPRC_Train: 99.81
 - Accuracy_Train: 97.33
 - F1-Score_Train: 97.40
 - Precision_Test: 2.75
 - Recall_Test: 86.84
 - AUPRC_Test: 69.56
 - Accuracy_Test: 94.78
 - F1-Score_Test: 5.32
✅ Tamaño del DataFrame actualizado: (3, 135)

🔄 Fold 4:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando CatBoost (Validación Cruzada con ADASYN)...
0:	learn: 0.6328230	total: 100ms	remaining: 30s
50:	learn: 0.1022797	total: 4.71s	remaining: 23s
100:	learn: 0.0634689	total: 11.8s	remaining: 23.3s
150:	learn: 0.0466170	total: 16.5s	remaining: 16.3s
200:	learn: 0.0366830	total: 21.5s	remaining: 10.6s
250:	learn: 0.0299165	total: 28.3s	remaining: 5.53s
299:	learn: 0.0249728	total: 32.9s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con ADASYN):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 94.37
 - Recall_Train: 100.00
 - AUPRC_Train: 99.79
 - Accuracy_Train: 97.01
 - F1-Score_Train: 97.10
 - Precision_Test: 2.73
 - Recall_Test: 97.37
 - AUPRC_Test: 69.31
 - Accuracy_Test: 94.13
 - F1-Score_Test: 5.31
✅ Tamaño del DataFrame actualizado: (4, 135)

🔄 Fold 5:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando CatBoost (Validación Cruzada con ADASYN)...
0:	learn: 0.6355333	total: 95.3ms	remaining: 28.5s
50:	learn: 0.1033309	total: 7.25s	remaining: 35.4s
100:	learn: 0.0624388	total: 11.9s	remaining: 23.5s
150:	learn: 0.0470063	total: 17.4s	remaining: 17.2s
200:	learn: 0.0368880	total: 23.8s	remaining: 11.7s
250:	learn: 0.0304170	total: 28.5s	remaining: 5.56s
299:	learn: 0.0257307	total: 35s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con ADASYN):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 94.13
 - Recall_Train: 100.00
 - AUPRC_Train: 99.77
 - Accuracy_Train: 96.88
 - F1-Score_Train: 96.98
 - Precision_Test: 2.46
 - Recall_Test: 94.74
 - AUPRC_Test: 61.84
 - Accuracy_Test: 93.65
 - F1-Score_Test: 4.80
✅ Tamaño del DataFrame actualizado: (5, 135)

🔄 Fold 6:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando CatBoost (Validación Cruzada con ADASYN)...
0:	learn: 0.6348474	total: 91.3ms	remaining: 27.3s
50:	learn: 0.1013061	total: 4.77s	remaining: 23.3s
100:	learn: 0.0610414	total: 10.9s	remaining: 21.6s
150:	learn: 0.0451405	total: 16.5s	remaining: 16.2s
200:	learn: 0.0361169	total: 21.1s	remaining: 10.4s
250:	learn: 0.0292808	total: 28.4s	remaining: 5.54s
299:	learn: 0.0241619	total: 33s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con ADASYN):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 94.45
 - Recall_Train: 100.00
 - AUPRC_Train: 99.79
 - Accuracy_Train: 97.06
 - F1-Score_Train: 97.14
 - Precision_Test: 2.58
 - Recall_Test: 94.74
 - AUPRC_Test: 69.79
 - Accuracy_Test: 93.94
 - F1-Score_Test: 5.02
 - verbose: 50
 - depth: 4
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 10.00
 - class_weights: [1, 10]
 - (resto de hiperparámetros de CatBoost: None)
✅ Tamaño del DataFrame actualizado: (6, 135)

🔄 Fold 7:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando CatBoost (Validación Cruzada con ADASYN)...
0:	learn: 0.6334032	total: 93.7ms	remaining: 28s
50:	learn: 0.1056545	total: 7.25s	remaining: 35.4s
100:	learn: 0.0640097	total: 12.2s	remaining: 24s
150:	learn: 0.0467395	total: 19.8s	remaining: 19.5s
200:	learn: 0.0367313	total: 26.1s	remaining: 12.9s
250:	learn: 0.0306374	total: 30.8s	remaining: 6.01s
299:	learn: 0.0261628	total: 37.3s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con ADASYN):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 94.05
 - Recall_Train: 100.00
 - AUPRC_Train: 99.75
 - Accuracy_Train: 96.83
 - F1-Score_Train: 96.93
 - Precision_Test: 2.44
 - Recall_Test: 94.74
 - AUPRC_Test: 82.42
 - Accuracy_Test: 93.59
 - F1-Score_Test: 4.76
 - verbose: 50
 - depth: 4
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 10.00
 - class_weights: [1, 10]
 - (resto de hiperparámetros de CatBoost: None)
✅ Tamaño del DataFrame actualizado: (7, 135)

🔄 Fold 8:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando CatBoost (Validación Cruzada con ADASYN)...
0:	learn: 0.6342208	total: 99.3ms	remaining: 29.7s
50:	learn: 0.0994658	total: 4.71s	remaining: 23s
100:	learn: 0.0607605	total: 11.1s	remaining: 21.9s
150:	learn: 0.0458248	total: 16.5s	remaining: 16.3s
200:	learn: 0.0362803	total: 21.2s	remaining: 10.4s
250:	learn: 0.0296385	total: 28.4s	remaining: 5.55s
299:	learn: 0.0248807	total: 35.5s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con ADASYN):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 94.43
 - Recall_Train: 100.00
 - AUPRC_Train: 99.79
 - Accuracy_Train: 97.05
 - F1-Score_Train: 97.14
 - Precision_Test: 2.68
 - Recall_Test: 94.74
 - AUPRC_Test: 69.48
 - Accuracy_Test: 94.17
 - F1-Score_Test: 5.21
 - verbose: 50
 - depth: 4
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 10.00
 - class_weights: [1, 10]
 - (resto de hiperparámetros de CatBoost: None)
✅ Tamaño del DataFrame actualizado: (8, 135)

🔄 Fold 9:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando CatBoost (Validación Cruzada con ADASYN)...
0:	learn: 0.6378937	total: 156ms	remaining: 46.7s
50:	learn: 0.1028674	total: 6.98s	remaining: 34.1s
100:	learn: 0.0644691	total: 11.6s	remaining: 22.9s
150:	learn: 0.0476302	total: 17.6s	remaining: 17.4s
200:	learn: 0.0372335	total: 23.5s	remaining: 11.6s
250:	learn: 0.0308993	total: 28.2s	remaining: 5.51s
299:	learn: 0.0261942	total: 35.2s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con ADASYN):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 94.19
 - Recall_Train: 100.00
 - AUPRC_Train: 99.72
 - Accuracy_Train: 96.92
 - F1-Score_Train: 97.01
 - Precision_Test: 2.51
 - Recall_Test: 97.30
 - AUPRC_Test: 79.53
 - Accuracy_Test: 93.77
 - F1-Score_Test: 4.89
 - verbose: 50
 - depth: 4
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 10.00
 - class_weights: [1, 10]
 - (resto de hiperparámetros de CatBoost: None)
✅ Tamaño del DataFrame actualizado: (9, 135)

🔄 Fold 10:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando CatBoost (Validación Cruzada con ADASYN)...
0:	learn: 0.6353824	total: 96.1ms	remaining: 28.7s
50:	learn: 0.0941086	total: 4.67s	remaining: 22.8s
100:	learn: 0.0565049	total: 11.3s	remaining: 22.2s
150:	learn: 0.0418993	total: 16.3s	remaining: 16s
200:	learn: 0.0328070	total: 21s	remaining: 10.3s
250:	learn: 0.0264453	total: 28.2s	remaining: 5.5s
299:	learn: 0.0220465	total: 32.7s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con ADASYN):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 95.11
 - Recall_Train: 100.00
 - AUPRC_Train: 99.76
 - Accuracy_Train: 97.43
 - F1-Score_Train: 97.50
 - Precision_Test: 2.56
 - Recall_Test: 83.78
 - AUPRC_Test: 52.38
 - Accuracy_Test: 94.71
 - F1-Score_Test: 4.96
 - verbose: 50
 - depth: 4
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 10.00
 - class_weights: [1, 10]
 - (resto de hiperparámetros de CatBoost: None)
✅ Tamaño del DataFrame actualizado: (10, 135)

🚀 Evaluación con XGBoost usando ADASYN en cada fold...

🔄 Fold 1:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando XGBoost (Validación Cruzada con ADASYN)...

✅ Resultados para XGBoost (Validación Cruzada con ADASYN):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 96.36
 - Recall_Train: 100.00
 - AUPRC_Train: 99.92
 - Accuracy_Train: 98.11
 - F1-Score_Train: 98.15
 - Precision_Test: 4.10
 - Recall_Test: 94.74
 - AUPRC_Test: 72.46
 - Accuracy_Test: 96.24
 - F1-Score_Test: 7.85
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (11, 136)

🔄 Fold 2:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando XGBoost (Validación Cruzada con ADASYN)...

✅ Resultados para XGBoost (Validación Cruzada con ADASYN):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 96.55
 - Recall_Train: 100.00
 - AUPRC_Train: 99.94
 - Accuracy_Train: 98.21
 - F1-Score_Train: 98.24
 - Precision_Test: 3.97
 - Recall_Test: 92.11
 - AUPRC_Test: 70.20
 - Accuracy_Test: 96.22
 - F1-Score_Test: 7.61
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (12, 136)

🔄 Fold 3:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando XGBoost (Validación Cruzada con ADASYN)...

✅ Resultados para XGBoost (Validación Cruzada con ADASYN):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 96.72
 - Recall_Train: 100.00
 - AUPRC_Train: 99.95
 - Accuracy_Train: 98.31
 - F1-Score_Train: 98.33
 - Precision_Test: 4.12
 - Recall_Test: 86.84
 - AUPRC_Test: 72.85
 - Accuracy_Test: 96.56
 - F1-Score_Test: 7.87
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (13, 136)

🔄 Fold 4:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando XGBoost (Validación Cruzada con ADASYN)...

✅ Resultados para XGBoost (Validación Cruzada con ADASYN):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 96.11
 - Recall_Train: 100.00
 - AUPRC_Train: 99.93
 - Accuracy_Train: 97.97
 - F1-Score_Train: 98.01
 - Precision_Test: 3.75
 - Recall_Test: 94.74
 - AUPRC_Test: 71.90
 - Accuracy_Test: 95.88
 - F1-Score_Test: 7.21
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (14, 136)

🔄 Fold 5:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando XGBoost (Validación Cruzada con ADASYN)...

✅ Resultados para XGBoost (Validación Cruzada con ADASYN):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 96.11
 - Recall_Train: 100.00
 - AUPRC_Train: 99.92
 - Accuracy_Train: 97.98
 - F1-Score_Train: 98.02
 - Precision_Test: 3.45
 - Recall_Test: 89.47
 - AUPRC_Test: 65.77
 - Accuracy_Test: 95.75
 - F1-Score_Test: 6.65
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (15, 136)

🔄 Fold 6:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando XGBoost (Validación Cruzada con ADASYN)...

✅ Resultados para XGBoost (Validación Cruzada con ADASYN):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 96.31
 - Recall_Train: 100.00
 - AUPRC_Train: 99.94
 - Accuracy_Train: 98.09
 - F1-Score_Train: 98.12
 - Precision_Test: 3.89
 - Recall_Test: 92.11
 - AUPRC_Test: 77.85
 - Accuracy_Test: 96.14
 - F1-Score_Test: 7.47
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (16, 136)

🔄 Fold 7:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando XGBoost (Validación Cruzada con ADASYN)...

✅ Resultados para XGBoost (Validación Cruzada con ADASYN):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 96.21
 - Recall_Train: 100.00
 - AUPRC_Train: 99.92
 - Accuracy_Train: 98.03
 - F1-Score_Train: 98.07
 - Precision_Test: 3.70
 - Recall_Test: 89.47
 - AUPRC_Test: 86.93
 - Accuracy_Test: 96.04
 - F1-Score_Test: 7.10
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (17, 136)

🔄 Fold 8:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando XGBoost (Validación Cruzada con ADASYN)...

✅ Resultados para XGBoost (Validación Cruzada con ADASYN):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 96.29
 - Recall_Train: 100.00
 - AUPRC_Train: 99.93
 - Accuracy_Train: 98.08
 - F1-Score_Train: 98.11
 - Precision_Test: 3.88
 - Recall_Test: 92.11
 - AUPRC_Test: 74.77
 - Accuracy_Test: 96.13
 - F1-Score_Test: 7.45
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (18, 136)

🔄 Fold 9:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando XGBoost (Validación Cruzada con ADASYN)...

✅ Resultados para XGBoost (Validación Cruzada con ADASYN):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 96.15
 - Recall_Train: 100.00
 - AUPRC_Train: 99.91
 - Accuracy_Train: 98.00
 - F1-Score_Train: 98.04
 - Precision_Test: 3.72
 - Recall_Test: 97.30
 - AUPRC_Test: 86.43
 - Accuracy_Test: 95.85
 - F1-Score_Test: 7.17
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (19, 136)

🔄 Fold 10:
📊 Aplicando ADASYN en el conjunto de entrenamiento...

🚀 Entrenando XGBoost (Validación Cruzada con ADASYN)...

✅ Resultados para XGBoost (Validación Cruzada con ADASYN):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con ADASYN
 - Sobreajuste: 1
 - Precision_Train: 96.78
 - Recall_Train: 100.00
 - AUPRC_Train: 99.93
 - Accuracy_Train: 98.33
 - F1-Score_Train: 98.36
 - Precision_Test: 3.92
 - Recall_Test: 83.78
 - AUPRC_Test: 61.54
 - Accuracy_Test: 96.59
 - F1-Score_Test: 7.49
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (20, 136)
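Las métricas de test de cada fold (precisión muy baja junto a un recall alto, patrón típico con clases tan desbalanceadas) se calculan a partir de las predicciones sobre el fold de prueba. Esbozo mínimo con scikit-learn y etiquetas/probabilidades hipotéticas:

```python
import numpy as np
from sklearn.metrics import precision_score, recall_score, average_precision_score

# Etiquetas reales y probabilidades predichas (datos hipotéticos)
y_true  = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 0])
y_proba = np.array([0.1, 0.2, 0.8, 0.3, 0.1, 0.4, 0.9, 0.7, 0.35, 0.05])
y_pred  = (y_proba >= 0.5).astype(int)   # umbral de decisión 0.5

precision = precision_score(y_true, y_pred)            # TP / (TP + FP)
recall    = recall_score(y_true, y_pred)               # TP / (TP + FN)
auprc     = average_precision_score(y_true, y_proba)   # área bajo la curva P-R
print(f"Precision: {precision:.2%} | Recall: {recall:.2%} | AUPRC: {auprc:.2%}")
```

Nótese que la AUPRC se calcula sobre las probabilidades (independiente del umbral), mientras que precisión y recall dependen del umbral elegido.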

🏆 Resultados Finales Ordenados (por AUPRC_Test descendente; valores en %, redondeados a 2 decimales):

Todas las filas corresponden a la técnica "Validación Cruzada con ADASYN", con Sobreajuste = 1 y Fold = 10. Las columnas de hiperparámetros se resumen: las filas de CatBoost comparten iterations 300, learning_rate 0.03, depth 4, l2_leaf_reg 10, class_weights [1, 10] y verbose 50; las de XGBoost no registran hiperparámetros (None/NaN).

|  # | Modelo | Precision_Train | Recall_Train | AUPRC_Train | Accuracy_Train | F1_Train | Precision_Test | Recall_Test | AUPRC_Test | Accuracy_Test | F1_Test |
|---:|--------|----------------:|-------------:|------------:|---------------:|---------:|---------------:|------------:|-----------:|--------------:|--------:|
| 16 | XGBoost  | 96.21 | 100.00 | 99.92 | 98.03 | 98.07 | 3.70 | 89.47 | 86.93 | 96.04 | 7.10 |
| 18 | XGBoost  | 96.15 | 100.00 | 99.91 | 98.00 | 98.04 | 3.72 | 97.30 | 86.43 | 95.85 | 7.17 |
|  6 | CatBoost | 94.05 | 100.00 | 99.75 | 96.83 | 96.93 | 2.44 | 94.74 | 82.42 | 93.59 | 4.76 |
|  8 | CatBoost | 94.19 | 100.00 | 99.72 | 96.92 | 97.01 | 2.51 | 97.30 | 79.53 | 93.77 | 4.89 |
| 15 | XGBoost  | 96.31 | 100.00 | 99.94 | 98.09 | 98.12 | 3.89 | 92.11 | 77.85 | 96.14 | 7.47 |
| 17 | XGBoost  | 96.29 | 100.00 | 99.93 | 98.08 | 98.11 | 3.88 | 92.11 | 74.77 | 96.13 | 7.45 |
| 12 | XGBoost  | 96.72 | 100.00 | 99.95 | 98.31 | 98.33 | 4.12 | 86.84 | 72.85 | 96.56 | 7.87 |
| 10 | XGBoost  | 96.36 | 100.00 | 99.92 | 98.11 | 98.15 | 4.10 | 94.74 | 72.46 | 96.24 | 7.85 |
| 13 | XGBoost  | 96.11 | 100.00 | 99.93 | 97.97 | 98.01 | 3.75 | 94.74 | 71.90 | 95.88 | 7.21 |
|  0 | CatBoost | 94.33 | 100.00 | 99.74 | 97.00 | 97.08 | 2.64 | 94.74 | 70.43 | 94.09 | 5.14 |
| 11 | XGBoost  | 96.55 | 100.00 | 99.94 | 98.21 | 98.24 | 3.97 | 92.11 | 70.20 | 96.22 | 7.61 |
|  5 | CatBoost | 94.45 | 100.00 | 99.79 | 97.06 | 97.14 | 2.58 | 94.74 | 69.79 | 93.94 | 5.02 |
|  2 | CatBoost | 94.94 | 100.00 | 99.81 | 97.33 | 97.40 | 2.75 | 86.84 | 69.56 | 94.78 | 5.32 |
|  7 | CatBoost | 94.43 | 100.00 | 99.79 | 97.05 | 97.14 | 2.68 | 94.74 | 69.48 | 94.17 | 5.21 |
|  3 | CatBoost | 94.37 | 100.00 | 99.79 | 97.01 | 97.10 | 2.73 | 97.37 | 69.31 | 94.13 | 5.31 |
| 14 | XGBoost  | 96.11 | 100.00 | 99.92 | 97.98 | 98.02 | 3.45 | 89.47 | 65.77 | 95.75 | 6.65 |
|  4 | CatBoost | 94.13 | 100.00 | 99.77 | 96.88 | 96.98 | 2.46 | 94.74 | 61.84 | 93.65 | 4.80 |
| 19 | XGBoost  | 96.78 | 100.00 | 99.93 | 98.33 | 98.36 | 3.92 | 83.78 | 61.54 | 96.59 | 7.49 |
|  1 | CatBoost | 94.71 | 100.00 | 99.81 | 97.21 | 97.28 | 2.55 | 92.11 | 60.14 | 94.05 | 4.97 |
|  9 | CatBoost | 95.11 | 100.00 | 99.76 | 97.43 | 97.50 | 2.56 | 83.78 | 52.38 | 94.71 | 4.96 |
1               None            None  None    None   
9               None            None  None    None   

   per_object_feature_penalties allow_const_label mvs_reg  \
16                         None              None    None   
18                         None              None    None   
6                          None              None    None   
8                          None              None    None   
15                         None              None    None   
17                         None              None    None   
12                         None              None    None   
10                         None              None    None   
13                         None              None    None   
0                          None              None    None   
11                         None              None    None   
5                          None              None    None   
2                          None              None    None   
7                          None              None    None   
3                          None              None    None   
14                         None              None    None   
4                          None              None    None   
19                          NaN               NaN     NaN   
1                          None              None    None   
9                          None              None    None   

   dev_score_calc_obj_block_size ctr_leaf_count_limit max_ctr_complexity  \
16                          None                 None               None   
18                          None                 None               None   
6                           None                 None               None   
8                           None                 None               None   
15                          None                 None               None   
17                          None                 None               None   
12                          None                 None               None   
10                          None                 None               None   
13                          None                 None               None   
0                           None                 None               None   
11                          None                 None               None   
5                           None                 None               None   
2                           None                 None               None   
7                           None                 None               None   
3                           None                 None               None   
14                          None                 None               None   
4                           None                 None               None   
19                           NaN                  NaN                NaN   
1                           None                 None               None   
9                           None                 None               None   

   target_border metric_period eval_fraction allow_writing_files  \
16          None          None          None                None   
18          None          None          None                None   
6           None          None          None                None   
8           None          None          None                None   
15          None          None          None                None   
17          None          None          None                None   
12          None          None          None                None   
10          None          None          None                None   
13          None          None          None                None   
0           None          None          None                None   
11          None          None          None                None   
5           None          None          None                None   
2           None          None          None                None   
7           None          None          None                None   
3           None          None          None                None   
14          None          None          None                None   
4           None          None          None                None   
19           NaN           NaN           NaN                 NaN   
1           None          None          None                None   
9           None          None          None                None   

   save_snapshot classes_count ctr_description leaf_estimation_method  \
16          None          None            None                   None   
18          None          None            None                   None   
6           None          None            None                   None   
8           None          None            None                   None   
15          None          None            None                   None   
17          None          None            None                   None   
12          None          None            None                   None   
10          None          None            None                   None   
13          None          None            None                   None   
0           None          None            None                   None   
11          None          None            None                   None   
5           None          None            None                   None   
2           None          None            None                   None   
7           None          None            None                   None   
3           None          None            None                   None   
14          None          None            None                   None   
4           None          None            None                   None   
19           NaN           NaN             NaN                    NaN   
1           None          None            None                   None   
9           None          None            None                   None   

   one_hot_max_size min_data_in_leaf random_score_type colsample_bylevel  \
16             None             None              None              None   
18             None             None              None              None   
6              None             None              None              None   
8              None             None              None              None   
15             None             None              None              None   
17             None             None              None              None   
12             None             None              None              None   
10             None             None              None              None   
13             None             None              None              None   
0              None             None              None              None   
11             None             None              None              None   
5              None             None              None              None   
2              None             None              None              None   
7              None             None              None              None   
3              None             None              None              None   
14             None             None              None              None   
4              None             None              None              None   
19              NaN              NaN               NaN               NaN   
1              None             None              None              None   
9              None             None              None              None   

   bootstrap_type custom_metric thread_count bagging_temperature  \
16           None          None         None                None   
18           None          None         None                None   
6            None          None         None                None   
8            None          None         None                None   
15           None          None         None                None   
17           None          None         None                None   
12           None          None         None                None   
10           None          None         None                None   
13           None          None         None                None   
0            None          None         None                None   
11           None          None         None                None   
5            None          None         None                None   
2            None          None         None                None   
7            None          None         None                None   
3            None          None         None                None   
14           None          None         None                None   
4            None          None         None                None   
19            NaN           NaN          NaN                 NaN   
1            None          None         None                None   
9            None          None         None                None   

   random_strength nan_mode text_features per_float_feature_quantization  \
16            None     None          None                           None   
18            None     None          None                           None   
6             None     None          None                           None   
8             None     None          None                           None   
15            None     None          None                           None   
17            None     None          None                           None   
12            None     None          None                           None   
10            None     None          None                           None   
13            None     None          None                           None   
0             None     None          None                           None   
11            None     None          None                           None   
5             None     None          None                           None   
2             None     None          None                           None   
7             None     None          None                           None   
3             None     None          None                           None   
14            None     None          None                           None   
4             None     None          None                           None   
19             NaN      NaN           NaN                            NaN   
1             None     None          None                           None   
9             None     None          None                           None   

   simple_ctr output_borders use_best_model gpu_cat_features_storage  \
16       None           None           None                     None   
18       None           None           None                     None   
6        None           None           None                     None   
8        None           None           None                     None   
15       None           None           None                     None   
17       None           None           None                     None   
12       None           None           None                     None   
10       None           None           None                     None   
13       None           None           None                     None   
0        None           None           None                     None   
11       None           None           None                     None   
5        None           None           None                     None   
2        None           None           None                     None   
7        None           None           None                     None   
3        None           None           None                     None   
14       None           None           None                     None   
4        None           None           None                     None   
19        NaN            NaN            NaN                      NaN   
1        None           None           None                     None   
9        None           None           None                     None   

   combinations_ctr border_count feature_border_type data_partition  \
16             None         None                None           None   
18             None         None                None           None   
6              None         None                None           None   
8              None         None                None           None   
15             None         None                None           None   
17             None         None                None           None   
12             None         None                None           None   
10             None         None                None           None   
13             None         None                None           None   
0              None         None                None           None   
11             None         None                None           None   
5              None         None                None           None   
2              None         None                None           None   
7              None         None                None           None   
3              None         None                None           None   
14             None         None                None           None   
4              None         None                None           None   
19              NaN          NaN                 NaN            NaN   
1              None         None                None           None   
9              None         None                None           None   

   fold_permutation_block od_pval  name early_stopping_rounds tokenizers  \
16                   None    None  None                  None       None   
18                   None    None  None                  None       None   
6                    None    None  None                  None       None   
8                    None    None  None                  None       None   
15                   None    None  None                  None       None   
17                   None    None  None                  None       None   
12                   None    None  None                  None       None   
10                   None    None  None                  None       None   
13                   None    None  None                  None       None   
0                    None    None  None                  None       None   
11                   None    None  None                  None       None   
5                    None    None  None                  None       None   
2                    None    None  None                  None       None   
7                    None    None  None                  None       None   
3                    None    None  None                  None       None   
14                   None    None  None                  None       None   
4                    None    None  None                  None       None   
19                    NaN     NaN   NaN                   NaN        NaN   
1                    None    None  None                  None       None   
9                    None    None  None                  None       None   

   best_model_min_trees dev_efb_max_buckets feature_weights  \
16                 None                None            None   
18                 None                None            None   
6                  None                None            None   
8                  None                None            None   
15                 None                None            None   
17                 None                None            None   
12                 None                None            None   
10                 None                None            None   
13                 None                None            None   
0                  None                None            None   
11                 None                None            None   
5                  None                None            None   
2                  None                None            None   
7                  None                None            None   
3                  None                None            None   
14                 None                None            None   
4                  None                None            None   
19                  NaN                 NaN             NaN   
1                  None                None            None   
9                  None                None            None   

   posterior_sampling metadata boosting_type diffusion_temperature  \
16               None     None          None                  None   
18               None     None          None                  None   
6                None     None          None                  None   
8                None     None          None                  None   
15               None     None          None                  None   
17               None     None          None                  None   
12               None     None          None                  None   
10               None     None          None                  None   
13               None     None          None                  None   
0                None     None          None                  None   
11               None     None          None                  None   
5                None     None          None                  None   
2                None     None          None                  None   
7                None     None          None                  None   
3                None     None          None                  None   
14               None     None          None                  None   
4                None     None          None                  None   
19                NaN      NaN           NaN                   NaN   
1                None     None          None                  None   
9                None     None          None                  None   

   gpu_ram_part score_function approx_on_full_history sampling_unit task_type  \
16         None           None                   None          None      None   
18         None           None                   None          None      None   
6          None           None                   None          None      None   
8          None           None                   None          None      None   
15         None           None                   None          None      None   
17         None           None                   None          None      None   
12         None           None                   None          None      None   
10         None           None                   None          None      None   
13         None           None                   None          None      None   
0          None           None                   None          None      None   
11         None           None                   None          None      None   
5          None           None                   None          None      None   
2          None           None                   None          None      None   
7          None           None                   None          None      None   
3          None           None                   None          None      None   
14         None           None                   None          None      None   
4          None           None                   None          None      None   
19          NaN            NaN                    NaN           NaN       NaN   
1          None           None                   None          None      None   
9          None           None                   None          None      None   

   snapshot_interval   rsm store_all_simple_ctr random_seed  \
16              None  None                 None        None   
18              None  None                 None        None   
6               None  None                 None        None   
8               None  None                 None        None   
15              None  None                 None        None   
17              None  None                 None        None   
12              None  None                 None        None   
10              None  None                 None        None   
13              None  None                 None        None   
0               None  None                 None        None   
11              None  None                 None        None   
5               None  None                 None        None   
2               None  None                 None        None   
7               None  None                 None        None   
3               None  None                 None        None   
14              None  None                 None        None   
4               None  None                 None        None   
19               NaN   NaN                  NaN         NaN   
1               None  None                 None        None   
9               None  None                 None        None   

   sampling_frequency ctr_target_border_count final_ctr_computation_mode  \
16               None                    None                       None   
18               None                    None                       None   
6                None                    None                       None   
8                None                    None                       None   
15               None                    None                       None   
17               None                    None                       None   
12               None                    None                       None   
10               None                    None                       None   
13               None                    None                       None   
0                None                    None                       None   
11               None                    None                       None   
5                None                    None                       None   
2                None                    None                       None   
7                None                    None                       None   
3                None                    None                       None   
14               None                    None                       None   
4                None                    None                       None   
19                NaN                     NaN                        NaN   
1                None                    None                       None   
9                None                    None                       None   

   fixed_binary_splits auto_class_weights ctr_history_unit device_config  \
16                None               None             None          None   
18                None               None             None          None   
6                 None               None             None          None   
8                 None               None             None          None   
15                None               None             None          None   
17                None               None             None          None   
12                None               None             None          None   
10                None               None             None          None   
13                None               None             None          None   
0                 None               None             None          None   
11                None               None             None          None   
5                 None               None             None          None   
2                 None               None             None          None   
7                 None               None             None          None   
3                 None               None             None          None   
14                None               None             None          None   
4                 None               None             None          None   
19                 NaN                NaN              NaN           NaN   
1                 None               None             None          None   
9                 None               None             None          None   

   leaf_estimation_backtracking has_time fold_len_multiplier  \
16                         None     None                None   
18                         None     None                None   
6                          None     None                None   
8                          None     None                None   
15                         None     None                None   
17                         None     None                None   
12                         None     None                None   
10                         None     None                None   
13                         None     None                None   
0                          None     None                None   
11                         None     None                None   
5                          None     None                None   
2                          None     None                None   
7                          None     None                None   
3                          None     None                None   
14                         None     None                None   
4                          None     None                None   
19                          NaN      NaN                 NaN   
1                          None     None                None   
9                          None     None                None   

   pinned_memory_size feature_calcers model_shrink_rate od_type  \
16               None            None              None    None   
18               None            None              None    None   
6                None            None              None    None   
8                None            None              None    None   
15               None            None              None    None   
17               None            None              None    None   
12               None            None              None    None   
10               None            None              None    None   
13               None            None              None    None   
0                None            None              None    None   
11               None            None              None    None   
5                None            None              None    None   
2                None            None              None    None   
7                None            None              None    None   
3                None            None              None    None   
14               None            None              None    None   
4                None            None              None    None   
19                NaN             NaN               NaN     NaN   
1                None            None              None    None   
9                None            None              None    None   

   monotone_constraints dictionaries max_bin boost_from_average grow_policy  \
16                 None         None    None               None        None   
18                 None         None    None               None        None   
6                  None         None    None               None        None   
8                  None         None    None               None        None   
15                 None         None    None               None        None   
17                 None         None    None               None        None   
12                 None         None    None               None        None   
10                 None         None    None               None        None   
13                 None         None    None               None        None   
0                  None         None    None               None        None   
11                 None         None    None               None        None   
5                  None         None    None               None        None   
2                  None         None    None               None        None   
[... remaining columns of the hyperparameter DataFrame omitted: every value is None/NaN ...]
✅ Resultados guardados en 'resultados_adasyn.csv'
CPU times: user 15min 44s, sys: 11.9 s, total: 15min 56s
Wall time: 10min 12s

2. Cross-Validation with SMOTE Applied per Fold (StratifiedKFold)¶

Cross-validation evaluates the model on multiple train/test splits of the dataset, reducing the variance of the estimates and guarding against overfitting. Note that `cross_val_score` cannot apply SMOTE inside each fold, so the cell below runs a manual `StratifiedKFold` loop and resamples only the training portion of every fold: resampling before splitting would leak synthetic minority samples into the test folds and inflate the test metrics.

In [ ]:
%%time
# ==========================================================
# 2. Cross-validation with SMOTE applied inside each fold
#    (manual StratifiedKFold loop: cross_val_score cannot
#    resample per fold without an imblearn Pipeline)
# ==========================================================
# Note: %%time is a cell magic and must be the first line of the cell.

from imblearn.over_sampling import SMOTE
from sklearn.model_selection import StratifiedKFold
from catboost import CatBoostClassifier
from xgboost import XGBClassifier
import pandas as pd

# Original training data and labels
X = X_train
y = y_train

# Hyperparameter configurations
parametros_catboost = {
    'iterations': 300,
    'learning_rate': 0.03,
    'depth': 3,  # shallower trees to limit overfitting
    'l2_leaf_reg': 15.0,  # stronger L2 regularization
    'class_weights': [1, 10],
    'verbose': 50
}

parametros_xgb = {
    'learning_rate': 0.03,
    'max_depth': 3,  # shallower trees to limit overfitting
    'n_estimators': 300,
    'scale_pos_weight': 10,  # extra weight on the positive (fraud) class
    'subsample': 0.8,
    'gamma': 1.0,  # penalizes splits with little gain
    'min_child_weight': 5
}

# DataFrame to accumulate the per-fold results
columnas_resultados = [
    'Modelo', 'Tecnica', 'Sobreajuste',
    'Precision_Train', 'Recall_Train', 'AUPRC_Train', 'Accuracy_Train', 'F1-Score_Train',
    'Precision_Test', 'Recall_Test', 'AUPRC_Test', 'Accuracy_Test', 'F1-Score_Test',
    'iterations', 'learning_rate', 'depth', 'class_weights', 'verbose',
    'max_depth', 'n_estimators', 'scale_pos_weight', 'min_child_weight',
    'gamma', 'l2_leaf_reg', 'subsample', 'Fold'
]
resultados_validacion_cruzada_con_smote = pd.DataFrame(columns=columnas_resultados)

# Cross-validation strategy
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)

# ================================================
# Evaluate CatBoost and XGBoost, fold by fold
# ================================================
modelos = [
    (CatBoostClassifier, "CatBoost", parametros_catboost),
    (XGBClassifier, "XGBoost", parametros_xgb),
]

for clase_modelo, nombre_modelo, parametros in modelos:
    print(f"\n🚀 Evaluación con {nombre_modelo} usando SMOTE en cada fold...")
    for fold, (train_idx, test_idx) in enumerate(kfold.split(X, y), 1):
        print(f"\n🔄 Fold {fold}:")

        # Split the fold into training and test partitions
        X_train_fold, X_test_fold = X.iloc[train_idx], X.iloc[test_idx]
        y_train_fold, y_test_fold = y.iloc[train_idx], y.iloc[test_idx]

        # Apply SMOTE to the training partition only (never to the test fold)
        print("📊 Aplicando SMOTE en el conjunto de entrenamiento...")
        smote = SMOTE(sampling_strategy='auto', random_state=42)
        X_train_res, y_train_res = smote.fit_resample(X_train_fold, y_train_fold)

        # Class distribution after SMOTE
        print("🔍 Distribución DESPUÉS de SMOTE en fold:")
        print(y_train_res.value_counts(normalize=True))

        # Train and evaluate the model
        resultados_validacion_cruzada_con_smote = entrenar_y_evaluar(
            modelo=clase_modelo,
            nombre_modelo=nombre_modelo,
            parametros=parametros,
            X_train=X_train_res,
            y_train=y_train_res,
            X_test=X_test_fold,
            y_test=y_test_fold,
            tecnica="Validación Cruzada con SMOTE",
            umbral_sobreajuste=10,  # adjusted overfitting threshold
            resultados_df=resultados_validacion_cruzada_con_smote
        )
        # Record the fold number only on the row just appended;
        # assigning to the whole column would overwrite earlier folds
        ultima_fila = resultados_validacion_cruzada_con_smote.index[-1]
        resultados_validacion_cruzada_con_smote.loc[ultima_fila, 'Fold'] = fold

# ================================================
# Sort and display the results
# ================================================
resultados_ordenados_smote = resultados_validacion_cruzada_con_smote.sort_values(
    by=['AUPRC_Test', 'Recall_Test', 'Precision_Test', 'F1-Score_Test'],
    ascending=False
)

print("\n🏆 Resultados Finales Ordenados (Top Modelos):")
print(resultados_ordenados_smote)

# Save the results to CSV
output_file = "resultados_validacion_cruzada_smote.csv"
resultados_ordenados_smote.to_csv(output_file, index=False)
print(f"\n✅ Resultados guardados en '{output_file}'")
🚀 Evaluación con CatBoost usando SMOTE en cada fold...

🔄 Fold 1:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Validación Cruzada con SMOTE)...
0:	learn: 0.6197643	total: 172ms	remaining: 51.5s
50:	learn: 0.0837850	total: 4.59s	remaining: 22.4s
100:	learn: 0.0570631	total: 8.8s	remaining: 17.3s
150:	learn: 0.0452330	total: 15.1s	remaining: 14.9s
200:	learn: 0.0380934	total: 19.6s	remaining: 9.66s
250:	learn: 0.0329013	total: 23.8s	remaining: 4.65s
299:	learn: 0.0291132	total: 29.8s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 93.87
 - Recall_Train: 99.97
 - AUPRC_Train: 99.80
 - Accuracy_Train: 96.72
 - F1-Score_Train: 96.82
 - Precision_Test: 2.48
 - Recall_Test: 94.74
 - AUPRC_Test: 67.68
 - Accuracy_Test: 93.69
 - F1-Score_Test: 4.83
 - iterations: 300
 - learning_rate: 0.03
 - depth: 3
 - l2_leaf_reg: 15.00
 - class_weights: [1, 10]
 - verbose: 50
 [... all other CatBoostClassifier parameters were None and are omitted ...]
✅ Tamaño del DataFrame actualizado: (1, 135)

🔄 Fold 2:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Validación Cruzada con SMOTE)...
0:	learn: 0.6198680	total: 86.7ms	remaining: 25.9s
50:	learn: 0.0873865	total: 4.34s	remaining: 21.2s
100:	learn: 0.0587008	total: 8.75s	remaining: 17.2s
150:	learn: 0.0454690	total: 15.2s	remaining: 15s
200:	learn: 0.0373740	total: 19.4s	remaining: 9.55s
250:	learn: 0.0314223	total: 23.5s	remaining: 4.59s
299:	learn: 0.0273196	total: 30s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.32
 - Recall_Train: 99.98
 - AUPRC_Train: 99.83
 - Accuracy_Train: 96.98
 - F1-Score_Train: 97.07
 - Precision_Test: 2.42
 - Recall_Test: 92.11
 - AUPRC_Test: 60.34
 - Accuracy_Test: 93.71
 - F1-Score_Test: 4.72
 [... full hyperparameter listing (identical to Fold 1) omitted ...]
✅ Tamaño del DataFrame actualizado: (2, 135)

🔄 Fold 3:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Validación Cruzada con SMOTE)...
0:	learn: 0.6179122	total: 84.5ms	remaining: 25.3s
50:	learn: 0.0755571	total: 4.35s	remaining: 21.2s
100:	learn: 0.0504525	total: 10.4s	remaining: 20.5s
150:	learn: 0.0395327	total: 15.2s	remaining: 15s
200:	learn: 0.0331644	total: 19.4s	remaining: 9.55s
250:	learn: 0.0285134	total: 25.3s	remaining: 4.94s
299:	learn: 0.0250930	total: 30.1s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.50
 - Recall_Train: 99.98
 - AUPRC_Train: 99.83
 - Accuracy_Train: 97.08
 - F1-Score_Train: 97.16
 - Precision_Test: 2.42
 - Recall_Test: 86.84
 - AUPRC_Test: 63.91
 - Accuracy_Test: 94.07
 - F1-Score_Test: 4.72
 [... full hyperparameter listing (identical to Fold 1) omitted ...]
✅ Tamaño del DataFrame actualizado: (3, 135)

🔄 Fold 4:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Validación Cruzada con SMOTE)...
0:	learn: 0.6198310	total: 88.8ms	remaining: 26.5s
50:	learn: 0.0886550	total: 4.38s	remaining: 21.4s
100:	learn: 0.0602755	total: 11s	remaining: 21.7s
150:	learn: 0.0481493	total: 15.2s	remaining: 15s
200:	learn: 0.0405259	total: 19.4s	remaining: 9.53s
250:	learn: 0.0348938	total: 25.9s	remaining: 5.06s
299:	learn: 0.0306786	total: 30s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 93.69
 - Recall_Train: 99.98
 - AUPRC_Train: 99.81
 - Accuracy_Train: 96.62
 - F1-Score_Train: 96.73
 - Precision_Test: 2.36
 - Recall_Test: 97.37
 - AUPRC_Test: 71.12
 - Accuracy_Test: 93.17
 - F1-Score_Test: 4.60
 [... full hyperparameter listing (identical to Fold 1) omitted ...]
✅ Tamaño del DataFrame actualizado: (4, 135)

🔄 Fold 5:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Validación Cruzada con SMOTE)...
0:	learn: 0.6209804	total: 97.5ms	remaining: 29.1s
50:	learn: 0.0879092	total: 6.18s	remaining: 30.2s
100:	learn: 0.0595503	total: 10.9s	remaining: 21.4s
150:	learn: 0.0480335	total: 15s	remaining: 14.8s
200:	learn: 0.0405637	total: 20.7s	remaining: 10.2s
250:	learn: 0.0349176	total: 25.7s	remaining: 5.03s
299:	learn: 0.0306724	total: 29.9s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 93.68
 - Recall_Train: 99.97
 - AUPRC_Train: 99.81
 - Accuracy_Train: 96.61
 - F1-Score_Train: 96.72
 - Precision_Test: 2.30
 - Recall_Test: 94.74
 - AUPRC_Test: 62.59
 - Accuracy_Test: 93.20
 - F1-Score_Test: 4.50
 - verbose: 50
 - depth: 3
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 15.00
 - class_weights: [1, 10]
✅ Tamaño del DataFrame actualizado: (5, 135)

🔄 Fold 6:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Validación Cruzada con SMOTE)...
0:	learn: 0.6227778	total: 97.2ms	remaining: 29.1s
50:	learn: 0.0884626	total: 6.61s	remaining: 32.3s
100:	learn: 0.0578288	total: 10.9s	remaining: 21.4s
150:	learn: 0.0455781	total: 15.1s	remaining: 14.9s
200:	learn: 0.0381560	total: 21.8s	remaining: 10.7s
250:	learn: 0.0323976	total: 25.9s	remaining: 5.06s
299:	learn: 0.0282838	total: 30s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.12
 - Recall_Train: 99.98
 - AUPRC_Train: 99.83
 - Accuracy_Train: 96.87
 - F1-Score_Train: 96.96
 - Precision_Test: 2.52
 - Recall_Test: 94.74
 - AUPRC_Test: 70.72
 - Accuracy_Test: 93.80
 - F1-Score_Test: 4.91
 - verbose: 50
 - depth: 3
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 15.00
 - class_weights: [1, 10]
✅ Tamaño del DataFrame actualizado: (6, 135)

🔄 Fold 7:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Validación Cruzada con SMOTE)...
0:	learn: 0.6203246	total: 170ms	remaining: 50.8s
50:	learn: 0.0881837	total: 5.43s	remaining: 26.5s
100:	learn: 0.0606040	total: 9.64s	remaining: 19s
150:	learn: 0.0484906	total: 15.1s	remaining: 14.9s
200:	learn: 0.0410733	total: 22.8s	remaining: 11.2s
250:	learn: 0.0355084	total: 27s	remaining: 5.26s
299:	learn: 0.0311114	total: 31.1s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 93.53
 - Recall_Train: 99.96
 - AUPRC_Train: 99.79
 - Accuracy_Train: 96.52
 - F1-Score_Train: 96.64
 - Precision_Test: 2.31
 - Recall_Test: 94.74
 - AUPRC_Test: 81.38
 - Accuracy_Test: 93.22
 - F1-Score_Test: 4.51
 - verbose: 50
 - depth: 3
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 15.00
 - class_weights: [1, 10]
✅ Tamaño del DataFrame actualizado: (7, 135)

🔄 Fold 8:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Validación Cruzada con SMOTE)...
0:	learn: 0.6226187	total: 163ms	remaining: 48.9s
50:	learn: 0.0854899	total: 5.02s	remaining: 24.5s
100:	learn: 0.0587468	total: 10s	remaining: 19.7s
150:	learn: 0.0462800	total: 18.1s	remaining: 17.9s
200:	learn: 0.0388473	total: 22.3s	remaining: 11s
250:	learn: 0.0332201	total: 26.4s	remaining: 5.16s
299:	learn: 0.0292881	total: 33s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.15
 - Recall_Train: 99.97
 - AUPRC_Train: 99.83
 - Accuracy_Train: 96.88
 - F1-Score_Train: 96.97
 - Precision_Test: 2.52
 - Recall_Test: 94.74
 - AUPRC_Test: 70.05
 - Accuracy_Test: 93.80
 - F1-Score_Test: 4.91
 - verbose: 50
 - depth: 3
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 15.00
 - class_weights: [1, 10]
✅ Tamaño del DataFrame actualizado: (8, 135)

🔄 Fold 9:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Validación Cruzada con SMOTE)...
0:	learn: 0.6205673	total: 85.1ms	remaining: 25.5s
50:	learn: 0.0895558	total: 4.32s	remaining: 21.1s
100:	learn: 0.0607633	total: 9.37s	remaining: 18.5s
150:	learn: 0.0478809	total: 15s	remaining: 14.8s
200:	learn: 0.0404560	total: 19.1s	remaining: 9.43s
250:	learn: 0.0347237	total: 23.9s	remaining: 4.66s
299:	learn: 0.0307483	total: 29.8s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 93.68
 - Recall_Train: 99.96
 - AUPRC_Train: 99.78
 - Accuracy_Train: 96.60
 - F1-Score_Train: 96.71
 - Precision_Test: 2.23
 - Recall_Test: 97.30
 - AUPRC_Test: 78.07
 - Accuracy_Test: 92.96
 - F1-Score_Test: 4.36
 - verbose: 50
 - depth: 3
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 15.00
 - class_weights: [1, 10]
✅ Tamaño del DataFrame actualizado: (9, 135)

🔄 Fold 10:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Validación Cruzada con SMOTE)...
0:	learn: 0.6207747	total: 83.5ms	remaining: 25s
50:	learn: 0.0776857	total: 4.25s	remaining: 20.8s
100:	learn: 0.0530933	total: 10.7s	remaining: 21.2s
150:	learn: 0.0419197	total: 14.9s	remaining: 14.7s
200:	learn: 0.0347746	total: 19.1s	remaining: 9.4s
250:	learn: 0.0297048	total: 25.5s	remaining: 4.98s
299:	learn: 0.0261034	total: 29.8s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.72
 - Recall_Train: 99.90
 - AUPRC_Train: 99.81
 - Accuracy_Train: 97.17
 - F1-Score_Train: 97.24
 - Precision_Test: 2.34
 - Recall_Test: 83.78
 - AUPRC_Test: 54.96
 - Accuracy_Test: 94.23
 - F1-Score_Test: 4.56
 - verbose: 50
 - depth: 3
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 15.00
 - class_weights: [1, 10]
✅ Tamaño del DataFrame actualizado: (10, 135)
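The per-fold procedure logged above (resample only the training fold, keep the test fold at the real class distribution, then score AUPRC) can be sketched as below. This is a minimal illustration, not the notebook's code: it uses synthetic data, a naive random-oversampling stand-in for imblearn's SMOTE, and `LogisticRegression` as a stand-in for CatBoost/XGBoost, so that it runs with scikit-learn alone.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score

# Synthetic imbalanced data standing in for the credit-card dataset (~2% fraud)
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.98, 0.02],
                           random_state=42)

def oversample_minority(X_train, y_train, rng):
    # Stand-in for SMOTE: random oversampling of the minority class until
    # both classes are 50/50, as in the "Distribución DESPUÉS de SMOTE" logs.
    minority = np.flatnonzero(y_train == 1)
    majority = np.flatnonzero(y_train == 0)
    resampled = rng.choice(minority, size=len(majority), replace=True)
    idx = np.concatenate([majority, resampled])
    return X_train[idx], y_train[idx]

rng = np.random.default_rng(0)
auprc_scores = []
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for fold, (tr, te) in enumerate(skf.split(X, y), start=1):
    # Resample ONLY the training fold; the test fold keeps the real imbalance
    X_res, y_res = oversample_minority(X[tr], y[tr], rng)
    model = LogisticRegression(max_iter=1000).fit(X_res, y_res)
    proba = model.predict_proba(X[te])[:, 1]
    auprc_scores.append(average_precision_score(y[te], proba))

print(f"AUPRC medio: {np.mean(auprc_scores):.4f}")
```

Keeping the oversampling inside each fold (rather than resampling before the split) is what makes the test-fold metrics honest: synthetic minority samples never leak into evaluation.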

🚀 Evaluación con XGBoost usando SMOTE en cada fold...

🔄 Fold 1:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Validación Cruzada con SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.98
 - Recall_Train: 99.92
 - AUPRC_Train: 99.89
 - Accuracy_Train: 97.32
 - F1-Score_Train: 97.39
 - Precision_Test: 3.13
 - Recall_Test: 94.74
 - AUPRC_Test: 77.93
 - Accuracy_Test: 95.04
 - F1-Score_Test: 6.07
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (11, 136)

🔄 Fold 2:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Validación Cruzada con SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 95.23
 - Recall_Train: 99.99
 - AUPRC_Train: 99.91
 - Accuracy_Train: 97.49
 - F1-Score_Train: 97.55
 - Precision_Test: 2.76
 - Recall_Test: 89.47
 - AUPRC_Test: 67.44
 - Accuracy_Test: 94.66
 - F1-Score_Test: 5.36
✅ Tamaño del DataFrame actualizado: (12, 136)

🔄 Fold 3:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Validación Cruzada con SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 95.23
 - Recall_Train: 99.99
 - AUPRC_Train: 99.91
 - Accuracy_Train: 97.49
 - F1-Score_Train: 97.55
 - Precision_Test: 2.93
 - Recall_Test: 89.47
 - AUPRC_Test: 71.50
 - Accuracy_Test: 94.97
 - F1-Score_Test: 5.67
✅ Tamaño del DataFrame actualizado: (13, 136)

🔄 Fold 4:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Validación Cruzada con SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.49
 - Recall_Train: 99.94
 - AUPRC_Train: 99.89
 - Accuracy_Train: 97.05
 - F1-Score_Train: 97.14
 - Precision_Test: 2.68
 - Recall_Test: 97.37
 - AUPRC_Test: 77.59
 - Accuracy_Test: 94.02
 - F1-Score_Test: 5.22
✅ Tamaño del DataFrame actualizado: (14, 136)

🔄 Fold 5:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Validación Cruzada con SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.92
 - Recall_Train: 99.98
 - AUPRC_Train: 99.89
 - Accuracy_Train: 97.31
 - F1-Score_Train: 97.38
 - Precision_Test: 2.84
 - Recall_Test: 94.74
 - AUPRC_Test: 61.09
 - Accuracy_Test: 94.52
 - F1-Score_Test: 5.52
✅ Tamaño del DataFrame actualizado: (15, 136)

🔄 Fold 6:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Validación Cruzada con SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.89
 - Recall_Train: 99.91
 - AUPRC_Train: 99.91
 - Accuracy_Train: 97.27
 - F1-Score_Train: 97.34
 - Precision_Test: 2.91
 - Recall_Test: 94.74
 - AUPRC_Test: 78.19
 - Accuracy_Test: 94.64
 - F1-Score_Test: 5.64
✅ Tamaño del DataFrame actualizado: (16, 136)

🔄 Fold 7:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Validación Cruzada con SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.76
 - Recall_Train: 99.98
 - AUPRC_Train: 99.88
 - Accuracy_Train: 97.23
 - F1-Score_Train: 97.30
 - Precision_Test: 2.85
 - Recall_Test: 94.74
 - AUPRC_Test: 88.19
 - Accuracy_Test: 94.53
 - F1-Score_Test: 5.53
✅ Tamaño del DataFrame actualizado: (17, 136)

🔄 Fold 8:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Validación Cruzada con SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 95.11
 - Recall_Train: 99.94
 - AUPRC_Train: 99.89
 - Accuracy_Train: 97.40
 - F1-Score_Train: 97.47
 - Precision_Test: 3.05
 - Recall_Test: 94.74
 - AUPRC_Test: 66.82
 - Accuracy_Test: 94.90
 - F1-Score_Test: 5.91
✅ Tamaño del DataFrame actualizado: (18, 136)

🔄 Fold 9:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Validación Cruzada con SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.59
 - Recall_Train: 99.94
 - AUPRC_Train: 99.87
 - Accuracy_Train: 97.11
 - F1-Score_Train: 97.19
 - Precision_Test: 2.72
 - Recall_Test: 100.00
 - AUPRC_Test: 84.40
 - Accuracy_Test: 94.10
 - F1-Score_Test: 5.29
✅ Tamaño del DataFrame actualizado: (19, 136)

🔄 Fold 10:
📊 Aplicando SMOTE en el conjunto de entrenamiento...
🔍 Distribución DESPUÉS de SMOTE en fold:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Validación Cruzada con SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 95.49
 - Recall_Train: 99.92
 - AUPRC_Train: 99.91
 - Accuracy_Train: 97.60
 - F1-Score_Train: 97.65
 - Precision_Test: 2.82
 - Recall_Test: 83.78
 - AUPRC_Test: 65.16
 - Accuracy_Test: 95.21
 - F1-Score_Test: 5.45
✅ Tamaño del DataFrame actualizado: (20, 136)
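With all 20 fold results collected, the natural next step is to aggregate per model: mean and spread of the test AUPRC, plus the train-test gap that signals the overfitting flagged in every fold. The sketch below assumes a hypothetical mini-DataFrame with a few values taken from the logs above (the notebook's real results frame has 135+ columns); column names mirror the log fields.

```python
import pandas as pd

# Hypothetical mini-frame mimicking a few folds of the results DataFrame
folds = pd.DataFrame({
    "Modelo": ["CatBoost", "CatBoost", "XGBoost", "XGBoost"],
    "AUPRC_Train": [99.80, 99.83, 99.88, 99.87],
    "AUPRC_Test":  [67.68, 81.38, 88.19, 84.40],
    "Recall_Test": [94.74, 94.74, 94.74, 100.00],
})

# Mean/std of test AUPRC per model, plus the train-test AUPRC gap
resumen = (folds.groupby("Modelo")
                .agg(AUPRC_Test_media=("AUPRC_Test", "mean"),
                     AUPRC_Test_std=("AUPRC_Test", "std"),
                     Recall_Test_media=("Recall_Test", "mean"),
                     Gap_Train_Test=("AUPRC_Train", "mean")))
resumen["Gap_Train_Test"] = resumen["Gap_Train_Test"] - resumen["AUPRC_Test_media"]
print(resumen.round(2))
```

A gap of 20-30 AUPRC points between train and test, as seen in these folds, confirms that SMOTE-balanced training inflates train metrics while test precision stays very low.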

🏆 Resultados Finales Ordenados (Top Modelos):
      Modelo                       Tecnica Sobreajuste  Precision_Train  \
16   XGBoost  Validación Cruzada con SMOTE           1      94.76261114   
18   XGBoost  Validación Cruzada con SMOTE           1      94.59012244   
6   CatBoost  Validación Cruzada con SMOTE           1      93.52864034   
15   XGBoost  Validación Cruzada con SMOTE           1      94.89387924   
8   CatBoost  Validación Cruzada con SMOTE           1      93.67709653   
10   XGBoost  Validación Cruzada con SMOTE           1      94.97641443   
13   XGBoost  Validación Cruzada con SMOTE           1      94.48622375   
12   XGBoost  Validación Cruzada con SMOTE           1      95.22670780   
3   CatBoost  Validación Cruzada con SMOTE           1      93.68708317   
5   CatBoost  Validación Cruzada con SMOTE           1      94.11679690   
7   CatBoost  Validación Cruzada con SMOTE           1      94.14895354   
0   CatBoost  Validación Cruzada con SMOTE           1      93.86611482   
11   XGBoost  Validación Cruzada con SMOTE           1      95.22677422   
17   XGBoost  Validación Cruzada con SMOTE           1      95.11092677   
19   XGBoost  Validación Cruzada con SMOTE           1      95.48541599   
2   CatBoost  Validación Cruzada con SMOTE           1      94.49939844   
4   CatBoost  Validación Cruzada con SMOTE           1      93.67646718   
14   XGBoost  Validación Cruzada con SMOTE           1      94.91566866   
1   CatBoost  Validación Cruzada con SMOTE           1      94.32083653   
9   CatBoost  Validación Cruzada con SMOTE           1      94.72352016   

Resultados de la validación cruzada con SMOTE (10 particiones por modelo; filas 0–9 = CatBoost, filas 10–19 = XGBoost; valores en %, redondeados a dos decimales y ordenados por AUPRC_Test descendente):

| Fila | Modelo | Recall_Train | AUPRC_Train | Accuracy_Train | F1_Train | Precision_Test | Recall_Test | AUPRC_Test | Accuracy_Test | F1_Test |
|---|---|---|---|---|---|---|---|---|---|---|
| 16 | XGBoost | 99.98 | 99.88 | 97.23 | 97.30 | 2.85 | 94.74 | 88.19 | 94.53 | 5.53 |
| 18 | XGBoost | 99.94 | 99.87 | 97.11 | 97.19 | 2.72 | 100.00 | 84.40 | 94.10 | 5.29 |
| 6 | CatBoost | 99.96 | 99.79 | 96.52 | 96.64 | 2.31 | 94.74 | 81.38 | 93.22 | 4.51 |
| 15 | XGBoost | 99.91 | 99.91 | 97.27 | 97.34 | 2.91 | 94.74 | 78.19 | 94.64 | 5.64 |
| 8 | CatBoost | 99.96 | 99.78 | 96.60 | 96.71 | 2.23 | 97.30 | 78.07 | 92.96 | 4.36 |
| 10 | XGBoost | 99.92 | 99.89 | 97.32 | 97.39 | 3.13 | 94.74 | 77.93 | 95.04 | 6.07 |
| 13 | XGBoost | 99.94 | 99.89 | 97.05 | 97.14 | 2.68 | 97.37 | 77.59 | 94.02 | 5.22 |
| 12 | XGBoost | 99.99 | 99.91 | 97.49 | 97.55 | 2.93 | 89.47 | 71.50 | 94.97 | 5.67 |
| 3 | CatBoost | 99.98 | 99.81 | 96.62 | 96.73 | 2.36 | 97.37 | 71.12 | 93.17 | 4.60 |
| 5 | CatBoost | 99.98 | 99.83 | 96.87 | 96.96 | 2.52 | 94.74 | 70.72 | 93.80 | 4.91 |
| 7 | CatBoost | 99.97 | 99.83 | 96.88 | 96.97 | 2.52 | 94.74 | 70.05 | 93.80 | 4.91 |
| 0 | CatBoost | 99.97 | 99.80 | 96.72 | 96.82 | 2.48 | 94.74 | 67.68 | 93.69 | 4.83 |
| 11 | XGBoost | 99.99 | 99.91 | 97.49 | 97.55 | 2.76 | 89.47 | 67.44 | 94.66 | 5.36 |
| 17 | XGBoost | 99.94 | 99.89 | 97.40 | 97.47 | 3.05 | 94.74 | 66.82 | 94.90 | 5.91 |
| 19 | XGBoost | 99.92 | 99.91 | 97.60 | 97.65 | 2.82 | 83.78 | 65.16 | 95.21 | 5.45 |
| 2 | CatBoost | 99.98 | 99.83 | 97.08 | 97.16 | 2.42 | 86.84 | 63.91 | 94.07 | 4.72 |
| 4 | CatBoost | 99.97 | 99.81 | 96.61 | 96.72 | 2.30 | 94.74 | 62.59 | 93.20 | 4.50 |
| 14 | XGBoost | 99.98 | 99.89 | 97.31 | 97.38 | 2.84 | 94.74 | 61.09 | 94.52 | 5.52 |
| 1 | CatBoost | 99.98 | 99.83 | 96.98 | 97.07 | 2.42 | 92.11 | 60.34 | 93.71 | 4.72 |
| 9 | CatBoost | 99.90 | 99.81 | 97.17 | 97.24 | 2.34 | 83.78 | 54.96 | 94.23 | 4.56 |

(La columna `iterations` vale 300 en todas las filas de CatBoost y NaN en las de XGBoost.)
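Para la sección posterior «Mejores Modelos con AUPRC > 80%», un esbozo de cómo podría filtrarse y ordenarse la tabla de resultados con pandas. El DataFrame `res` es un mini-ejemplo hipotético construido con los valores de las tres primeras filas de la tabla anterior:

```python
import pandas as pd

# Mini-ejemplo hipotético con tres filas de la tabla de resultados
res = pd.DataFrame({
    "Modelo": ["XGBoost", "XGBoost", "CatBoost"],
    "Fila": [16, 18, 6],
    "Recall_Test": [94.74, 100.00, 94.74],
    "AUPRC_Test": [88.19, 84.40, 81.38],
})

# Filtrar AUPRC_Test > 80 y ordenar de mayor a menor
mejores = (res[res["AUPRC_Test"] > 80]
           .sort_values("AUPRC_Test", ascending=False))
print(mejores)
```

Sobre la tabla completa de 20 filas, el mismo filtro dejaría solo los folds con AUPRC_Test superior al 80 %.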

Hiperparámetros no nulos registrados en la tabla de resultados:

- CatBoost (filas 0–9): learning_rate = 0.03, depth = 3, class_weights = [1, 10], l2_leaf_reg = 15, verbose = 50.
- XGBoost (filas 10–19): todas las columnas de hiperparámetros aparecen como None/NaN.
- Fold = 10 en las 20 filas (validación cruzada de 10 particiones).
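Las 20 filas anteriores corresponden a 10 particiones de validación cruzada por modelo, con sobremuestreo aplicado únicamente al conjunto de entrenamiento de cada partición. Un esbozo mínimo de la idea, con datos sintéticos y `resample` de scikit-learn como sustituto simple de SMOTE (los nombres `X`, `y` y el clasificador son ilustrativos, no los del notebook):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold
from sklearn.utils import resample

# Datos sintéticos desbalanceados (sustituto del dataset real de fraudes)
X, y = make_classification(n_samples=4000, n_features=10,
                           weights=[0.97, 0.03], random_state=42)

skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)
aupr_scores = []
for train_idx, test_idx in skf.split(X, y):
    X_tr, y_tr = X[train_idx], y[train_idx]
    X_te, y_te = X[test_idx], y[test_idx]

    # Sobremuestreo SOLO del entrenamiento del fold (resample con
    # reemplazo como sustituto simple de SMOTE; el conjunto de prueba
    # conserva la distribución original y no hay fuga de datos)
    X_min, y_min = X_tr[y_tr == 1], y_tr[y_tr == 1]
    n_maj = int((y_tr == 0).sum())
    X_up, y_up = resample(X_min, y_min, n_samples=n_maj, random_state=42)
    X_bal = np.vstack([X_tr[y_tr == 0], X_up])
    y_bal = np.concatenate([y_tr[y_tr == 0], y_up])

    model = LogisticRegression(max_iter=1000).fit(X_bal, y_bal)
    proba = model.predict_proba(X_te)[:, 1]
    # AUPRC del fold, evaluado sobre datos SIN balancear
    aupr_scores.append(average_precision_score(y_te, proba))

print(f"AUPRC medio (10 particiones): {np.mean(aupr_scores):.4f}")
```

Balancear dentro de cada partición (y no antes de dividir) evita que copias sintéticas del entrenamiento se filtren al conjunto de prueba e inflen las métricas.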

Las restantes columnas de parámetros internos de CatBoost/XGBoost (model_shrink_mode, per_feature_ctr, eta, bootstrap_type, od_type, etc.) son None/NaN en todas las filas y se omiten por brevedad.

   monotone_constraints dictionaries max_bin boost_from_average grow_policy  \
16                 None         None    None               None        None   
18                 None         None    None               None        None   
6                  None         None    None               None        None   
15                 None         None    None               None        None   
8                  None         None    None               None        None   
10                 None         None    None               None        None   
13                 None         None    None               None        None   
12                 None         None    None               None        None   
3                  None         None    None               None        None   
5                  None         None    None               None        None   
7                  None         None    None               None        None   
0                  None         None    None               None        None   
11                 None         None    None               None        None   
17                 None         None    None               None        None   
19                  NaN          NaN     NaN                NaN         NaN   
2                  None         None    None               None        None   
4                  None         None    None               None        None   
14                 None         None    None               None        None   
1                  None         None    None               None        None   
9                  None         None    None               None        None   

   embedding_features langevin callback cat_features train_dir  \
16               None     None     None         None      None   
18               None     None     None         None      None   
6                None     None     None         None      None   
15               None     None     None         None      None   
8                None     None     None         None      None   
10               None     None     None         None      None   
13               None     None     None         None      None   
12               None     None     None         None      None   
3                None     None     None         None      None   
5                None     None     None         None      None   
7                None     None     None         None      None   
0                None     None     None         None      None   
11               None     None     None         None      None   
17               None     None     None         None      None   
19                NaN      NaN      NaN          NaN       NaN   
2                None     None     None         None      None   
4                None     None     None         None      None   
14               None     None     None         None      None   
1                None     None     None         None      None   
9                None     None     None         None      None   

   sparse_features_conflict_fraction ignored_features num_trees  \
16                              None             None      None   
18                              None             None      None   
6                               None             None      None   
15                              None             None      None   
8                               None             None      None   
10                              None             None      None   
13                              None             None      None   
12                              None             None      None   
3                               None             None      None   
5                               None             None      None   
7                               None             None      None   
0                               None             None      None   
11                              None             None      None   
17                              None             None      None   
19                               NaN              NaN       NaN   
2                               None             None      None   
4                               None             None      None   
14                              None             None      None   
1                               None             None      None   
9                               None             None      None   

   penalties_coefficient objective used_ram_limit text_processing reg_lambda  \
16                  None      None           None            None       None   
18                  None      None           None            None       None   
6                   None      None           None            None       None   
15                  None      None           None            None       None   
8                   None      None           None            None       None   
10                  None      None           None            None       None   
13                  None      None           None            None       None   
12                  None      None           None            None       None   
3                   None      None           None            None       None   
5                   None      None           None            None       None   
7                   None      None           None            None       None   
0                   None      None           None            None       None   
11                  None      None           None            None       None   
17                  None      None           None            None       None   
19                   NaN      None            NaN             NaN        NaN   
2                   None      None           None            None       None   
4                   None      None           None            None       None   
14                  None      None           None            None       None   
1                   None      None           None            None       None   
9                   None      None           None            None       None   

   snapshot_file random_state custom_loss loss_function  \
16          None         None        None          None   
18          None         None        None          None   
6           None         None        None          None   
15          None         None        None          None   
8           None         None        None          None   
10          None         None        None          None   
13          None         None        None          None   
12          None         None        None          None   
3           None         None        None          None   
5           None         None        None          None   
7           None         None        None          None   
0           None         None        None          None   
11          None         None        None          None   
17          None         None        None          None   
19           NaN          NaN         NaN           NaN   
2           None         None        None          None   
4           None         None        None          None   
14          None         None        None          None   
1           None         None        None          None   
9           None         None        None          None   

   leaf_estimation_iterations silent max_leaves input_borders  \
16                       None   None       None          None   
18                       None   None       None          None   
6                        None   None       None          None   
15                       None   None       None          None   
8                        None   None       None          None   
10                       None   None       None          None   
13                       None   None       None          None   
12                       None   None       None          None   
3                        None   None       None          None   
5                        None   None       None          None   
7                        None   None       None          None   
0                        None   None       None          None   
11                       None   None       None          None   
17                       None   None       None          None   
19                        NaN    NaN        NaN           NaN   
2                        None   None       None          None   
4                        None   None       None          None   
14                       None   None       None          None   
1                        None   None       None          None   
9                        None   None       None          None   

   counter_calc_method num_boost_round model_size_reg eval_metric num_leaves  \
16                None            None           None        None       None   
18                None            None           None        None       None   
6                 None            None           None        None       None   
15                None            None           None        None       None   
8                 None            None           None        None       None   
10                None            None           None        None       None   
13                None            None           None        None       None   
12                None            None           None        None       None   
3                 None            None           None        None       None   
5                 None            None           None        None       None   
7                 None            None           None        None       None   
0                 None            None           None        None       None   
11                None            None           None        None       None   
17                None            None           None        None       None   
19                 NaN             NaN            NaN         NaN        NaN   
2                 None            None           None        None       None   
4                 None            None           None        None       None   
14                None            None           None        None       None   
1                 None            None           None        None       None   
9                 None            None           None        None       None   

   min_child_samples class_names logging_level first_feature_use_penalties  \
16              None        None          None                        None   
18              None        None          None                        None   
6               None        None          None                        None   
15              None        None          None                        None   
8               None        None          None                        None   
10              None        None          None                        None   
13              None        None          None                        None   
12              None        None          None                        None   
3               None        None          None                        None   
5               None        None          None                        None   
7               None        None          None                        None   
0               None        None          None                        None   
11              None        None          None                        None   
17              None        None          None                        None   
19               NaN         NaN           NaN                         NaN   
2               None        None          None                        None   
4               None        None          None                        None   
14              None        None          None                        None   
1               None        None          None                        None   
9               None        None          None                        None   

   od_wait kwargs  
16    None    NaN  
18    None    NaN  
6     None    NaN  
15    None    NaN  
8     None    NaN  
10    None    NaN  
13    None    NaN  
12    None    NaN  
3     None    NaN  
5     None    NaN  
7     None    NaN  
0     None    NaN  
11    None    NaN  
17    None    NaN  
19     NaN   None  
2     None    NaN  
4     None    NaN  
14    None    NaN  
1     None    NaN  
9     None    NaN  

✅ Resultados guardados en 'resultados_validacion_cruzada_smote.csv'
CPU times: user 13min 59s, sys: 10.7 s, total: 14min 10s
Wall time: 8min 57s

3. Cross-Validation with StratifiedKFold (WITHOUT SMOTE)¶

Cross-validation evaluates the model on multiple subsets of the dataset, which reduces the variance of the results and helps detect overfitting.

SMOTE generates synthetic points that can make it easier for the model to memorize artificial patterns instead of generalizing, so we also run cross-validation on the original dataset (clean_data).

Evaluating the model without SMOTE shows how it performs on the original data.
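As a quick sanity check of the splitting strategy used below: StratifiedKFold keeps the minority-class ratio roughly constant in every fold, which matters for a target as imbalanced as fraud labels. A minimal sketch with hypothetical toy labels (10% positives), not the project data:

```python
# Sketch: StratifiedKFold preserves the class ratio of an imbalanced
# target in each fold (toy data, not the creditcard dataset).
import numpy as np
from sklearn.model_selection import StratifiedKFold

X = np.arange(100).reshape(-1, 1)      # 100 samples
y = np.array([1] * 10 + [0] * 90)      # 10% positive class

kfold = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for fold, (train_idx, test_idx) in enumerate(kfold.split(X, y), 1):
    ratio = y[test_idx].mean()
    print(f"Fold {fold}: positive ratio in test = {ratio:.2f}")  # 0.10 in every fold
```

With 10 positives split across 5 folds, each test fold receives exactly 2 positives out of 20 samples, so the ratio never drifts, unlike a plain KFold, which could leave a fold with no fraud cases at all.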

In [ ]:
%%time
# ==========================================================
# 3. Validación Cruzada SIN SMOTE
# ==========================================================

from sklearn.model_selection import StratifiedKFold
from catboost import CatBoostClassifier
from xgboost import XGBClassifier
import pandas as pd

# Configuraciones de hiperparámetros
parametros_catboost = {
    'iterations': 300,
    'learning_rate': 0.03,
    'depth': 4,
    'l2_leaf_reg': 10.0,  # Mayor regularización
    'class_weights': [1, 10],  # Balanceo de clases nativo
    'verbose': 50
}

parametros_xgb = {
    'learning_rate': 0.03,
    'max_depth': 4,
    'n_estimators': 300,
    'min_child_weight': 2,  # Evita árboles muy profundos
    'scale_pos_weight': 10,  # Balanceo de clases
    'gamma': 0.2,  # Regularización adicional
    'subsample': 0.8  # Submuestra del 80%
}

# Datos originales
X = X_train  # Usar datos originales de entrenamiento
y = y_train  # Etiquetas de entrenamiento

# Estrategia de validación cruzada
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)

# Lista completa de columnas esperadas en resultados_maestro
columnas_resultados = [
    'Modelo', 'Tecnica', 'Sobreajuste',
    'Precision_Train', 'Recall_Train', 'AUPRC_Train', 'Accuracy_Train', 'F1-Score_Train',
    'Precision_Test', 'Recall_Test', 'AUPRC_Test', 'Accuracy_Test', 'F1-Score_Test',
    'iterations', 'learning_rate', 'depth', 'class_weights', 'verbose',
    'max_depth', 'n_estimators', 'scale_pos_weight', 'min_child_weight',
    'gamma', 'l2_leaf_reg', 'subsample', 'Fold'
]

# Inicializar DataFrame para almacenar resultados
resultados_validacion_cruzada_sin_smote = pd.DataFrame(columns=columnas_resultados)

# ================================================
# Evaluación con CatBoost SIN SMOTE
# ================================================
print("\n🚀 Evaluación con CatBoost SIN SMOTE en cada fold...")
for fold, (train_idx, test_idx) in enumerate(kfold.split(X, y), 1):
    print(f"\n🔎 Fold {fold}/{kfold.get_n_splits()} - Entrenando CatBoost...")

    # Dividir el fold en entrenamiento y prueba
    X_train_fold, X_test_fold = X.iloc[train_idx], X.iloc[test_idx]
    y_train_fold, y_test_fold = y.iloc[train_idx], y.iloc[test_idx]

    # Entrenar y evaluar el modelo
    resultados_validacion_cruzada_sin_smote = entrenar_y_evaluar(
        modelo=CatBoostClassifier,
        nombre_modelo="CatBoost",
        parametros=parametros_catboost,
        X_train=X_train_fold,
        y_train=y_train_fold,
        X_test=X_test_fold,
        y_test=y_test_fold,
        tecnica="Validación Cruzada SIN SMOTE",
        umbral_sobreajuste=10,  # Aplicar umbral ajustado
        resultados_df=resultados_validacion_cruzada_sin_smote
    )
    resultados_validacion_cruzada_sin_smote.loc[resultados_validacion_cruzada_sin_smote.index[-1], 'Fold'] = fold

# ================================================
# Evaluación con XGBoost SIN SMOTE
# ================================================
print("\n🚀 Evaluación con XGBoost SIN SMOTE en cada fold...")
for fold, (train_idx, test_idx) in enumerate(kfold.split(X, y), 1):
    print(f"\n🔎 Fold {fold}/{kfold.get_n_splits()} - Entrenando XGBoost...")

    # Dividir el fold en entrenamiento y prueba
    X_train_fold, X_test_fold = X.iloc[train_idx], X.iloc[test_idx]
    y_train_fold, y_test_fold = y.iloc[train_idx], y.iloc[test_idx]

    # Entrenar y evaluar el modelo
    resultados_validacion_cruzada_sin_smote = entrenar_y_evaluar(
        modelo=XGBClassifier,
        nombre_modelo="XGBoost",
        parametros=parametros_xgb,
        X_train=X_train_fold,
        y_train=y_train_fold,
        X_test=X_test_fold,
        y_test=y_test_fold,
        tecnica="Validación Cruzada SIN SMOTE",
        umbral_sobreajuste=10,  # Aplicar umbral ajustado
        resultados_df=resultados_validacion_cruzada_sin_smote
    )
    resultados_validacion_cruzada_sin_smote.loc[resultados_validacion_cruzada_sin_smote.index[-1], 'Fold'] = fold

# ================================================
# Mostrar y Ordenar Resultados
# ================================================
resultados_ordenados_sin_smote = resultados_validacion_cruzada_sin_smote.sort_values(
    by=['AUPRC_Test', 'Recall_Test', 'Precision_Test', 'F1-Score_Test'],
    ascending=[False, False, False, False]
)

print("\n🏆 Resultados Finales Ordenados (Top Modelos):")
print(resultados_ordenados_sin_smote)

# Guardar los resultados en CSV
output_file = "resultados_validacion_cruzada_sin_smote.csv"
resultados_ordenados_sin_smote.to_csv(output_file, index=False)
print(f"\n✅ Resultados guardados en '{output_file}'")
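The fixed weights above (class_weights=[1, 10] for CatBoost, scale_pos_weight=10 for XGBoost) approximate the negative-to-positive ratio of the training labels. A minimal sketch of the usual heuristic, with hypothetical toy labels; the cap at 10 is an assumption of this sketch to match the fixed value chosen above, not part of the notebook:

```python
# Sketch: deriving a class-balance weight from the label distribution.
# XGBoost's scale_pos_weight is commonly set to n_negative / n_positive.
import pandas as pd

def peso_clase_positiva(y, tope=10):
    """Negative/positive ratio, capped to avoid extreme weights."""
    conteo = y.value_counts()
    return min(conteo.get(0, 0) / max(conteo.get(1, 0), 1), tope)

y_demo = pd.Series([0] * 950 + [1] * 50)   # toy labels: 5% positives
print(peso_clase_positiva(y_demo))         # 950/50 = 19 → capped to 10
```

Capping the weight keeps the loss from being dominated by the minority class when the imbalance is extreme, at the cost of some recall.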
🚀 Evaluación con CatBoost SIN SMOTE en cada fold...

🔎 Fold 1/10 - Entrenando CatBoost...

🚀 Entrenando CatBoost (Validación Cruzada SIN SMOTE)...
0:	learn: 0.6054052	total: 46.6ms	remaining: 13.9s
50:	learn: 0.0203590	total: 2.49s	remaining: 12.2s
100:	learn: 0.0149727	total: 4.89s	remaining: 9.63s
150:	learn: 0.0131578	total: 7.27s	remaining: 7.17s
200:	learn: 0.0119017	total: 11.6s	remaining: 5.73s
250:	learn: 0.0109190	total: 14.9s	remaining: 2.9s
299:	learn: 0.0099803	total: 17.2s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada SIN SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 0
 - Precision_Train: 82.02
 - Recall_Train: 88.53
 - AUPRC_Train: 90.30
 - Accuracy_Train: 99.95
 - F1-Score_Train: 85.15
 - Precision_Test: 79.49
 - Recall_Test: 81.58
 - AUPRC_Test: 86.04
 - Accuracy_Test: 99.93
 - F1-Score_Test: 80.52
 - depth: 4
 - learning_rate: 0.03
 - iterations: 300
 - l2_leaf_reg: 10.00
 - class_weights: [1, 10]
 - verbose: 50
 [... remaining CatBoost parameters (all None) omitted ...]
✅ Tamaño del DataFrame actualizado: (1, 135)

🔎 Fold 2/10 - Entrenando CatBoost...

🚀 Entrenando CatBoost (Validación Cruzada SIN SMOTE)...
0:	learn: 0.6041546	total: 46.2ms	remaining: 13.8s
50:	learn: 0.0199121	total: 2.45s	remaining: 11.9s
100:	learn: 0.0144349	total: 4.97s	remaining: 9.8s
150:	learn: 0.0125894	total: 9.51s	remaining: 9.39s
200:	learn: 0.0113453	total: 12.3s	remaining: 6.07s
250:	learn: 0.0104474	total: 14.7s	remaining: 2.87s
299:	learn: 0.0096953	total: 17.1s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada SIN SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 0
 - Precision_Train: 84.31
 - Recall_Train: 88.53
 - AUPRC_Train: 90.45
 - Accuracy_Train: 99.95
 - F1-Score_Train: 86.37
 - Precision_Test: 83.33
 - Recall_Test: 78.95
 - AUPRC_Test: 81.39
 - Accuracy_Test: 99.94
 - F1-Score_Test: 81.08
 - depth: 4
 - learning_rate: 0.03
 - iterations: 300
 - l2_leaf_reg: 10.00
 - class_weights: [1, 10]
 - verbose: 50
 [... remaining CatBoost parameters (all None) omitted ...]
✅ Tamaño del DataFrame actualizado: (2, 135)

🔎 Fold 3/10 - Entrenando CatBoost...

🚀 Entrenando CatBoost (Validación Cruzada SIN SMOTE)...
0:	learn: 0.6024564	total: 46.9ms	remaining: 14s
50:	learn: 0.0196367	total: 3.01s	remaining: 14.7s
100:	learn: 0.0140014	total: 7.3s	remaining: 14.4s
150:	learn: 0.0122469	total: 9.54s	remaining: 9.42s
200:	learn: 0.0109673	total: 11.8s	remaining: 5.82s
250:	learn: 0.0099498	total: 14.1s	remaining: 2.75s
299:	learn: 0.0090632	total: 16.3s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada SIN SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 0
 - Precision_Train: 82.34
 - Recall_Train: 89.12
 - AUPRC_Train: 90.97
 - Accuracy_Train: 99.95
 - F1-Score_Train: 85.59
 - Precision_Test: 86.49
 - Recall_Test: 84.21
 - AUPRC_Test: 82.43
 - Accuracy_Test: 99.95
 - F1-Score_Test: 85.33
 - depth: 4
 - learning_rate: 0.03
 - iterations: 300
 - l2_leaf_reg: 10.00
 - class_weights: [1, 10]
 - verbose: 50
 [... remaining CatBoost parameters (all None) omitted ...]
✅ Tamaño del DataFrame actualizado: (3, 135)

🔎 Fold 4/10 - Entrenando CatBoost...

🚀 Entrenando CatBoost (Validación Cruzada SIN SMOTE)...
0:	learn: 0.6034793	total: 103ms	remaining: 30.7s
50:	learn: 0.0202511	total: 4.66s	remaining: 22.8s
100:	learn: 0.0145433	total: 6.93s	remaining: 13.7s
150:	learn: 0.0130246	total: 9.15s	remaining: 9.03s
200:	learn: 0.0118574	total: 11.4s	remaining: 5.62s
250:	learn: 0.0109478	total: 13.7s	remaining: 2.67s
299:	learn: 0.0100997	total: 17.3s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada SIN SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 0
 - Precision_Train: 82.47
 - Recall_Train: 88.53
 - AUPRC_Train: 90.10
 - Accuracy_Train: 99.95
 - F1-Score_Train: 85.39
 - Precision_Test: 83.33
 - Recall_Test: 78.95
 - AUPRC_Test: 82.11
 - Accuracy_Test: 99.94
 - F1-Score_Test: 81.08
 - verbose: 50
 - depth: 4
 - iterations: 300
 - learning_rate: 0.03
 - l2_leaf_reg: 10.00
 - class_weights: [1, 10]
 - (all other CatBoost parameters: None)
✅ Tamaño del DataFrame actualizado: (4, 135)

🔎 Fold 5/10 - Entrenando CatBoost...

🚀 Entrenando CatBoost (Validación Cruzada SIN SMOTE)...
0:	learn: 0.6053334	total: 45.1ms	remaining: 13.5s
50:	learn: 0.0201140	total: 2.35s	remaining: 11.5s
100:	learn: 0.0146940	total: 4.61s	remaining: 9.08s
150:	learn: 0.0128819	total: 6.88s	remaining: 6.79s
200:	learn: 0.0117039	total: 9.13s	remaining: 4.5s
250:	learn: 0.0107222	total: 12.9s	remaining: 2.52s
299:	learn: 0.0098954	total: 16.1s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada SIN SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 84.87
 - Recall_Train: 89.12
 - AUPRC_Train: 90.34
 - Accuracy_Train: 99.96
 - F1-Score_Train: 86.94
 - Precision_Test: 65.96
 - Recall_Test: 81.58
 - AUPRC_Test: 77.20
 - Accuracy_Test: 99.90
 - F1-Score_Test: 72.94
✅ Tamaño del DataFrame actualizado: (5, 135)

🔎 Fold 6/10 - Entrenando CatBoost...

🚀 Entrenando CatBoost (Validación Cruzada SIN SMOTE)...
0:	learn: 0.6047216	total: 46.6ms	remaining: 13.9s
50:	learn: 0.0205041	total: 2.35s	remaining: 11.5s
100:	learn: 0.0151121	total: 4.62s	remaining: 9.1s
150:	learn: 0.0133615	total: 6.91s	remaining: 6.82s
200:	learn: 0.0121349	total: 11.2s	remaining: 5.5s
250:	learn: 0.0111330	total: 14.1s	remaining: 2.76s
299:	learn: 0.0102100	total: 16.4s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada SIN SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 0
 - Precision_Train: 81.92
 - Recall_Train: 87.94
 - AUPRC_Train: 89.82
 - Accuracy_Train: 99.95
 - F1-Score_Train: 84.82
 - Precision_Test: 80.49
 - Recall_Test: 86.84
 - AUPRC_Test: 87.71
 - Accuracy_Test: 99.94
 - F1-Score_Test: 83.54
✅ Tamaño del DataFrame actualizado: (6, 135)

🔎 Fold 7/10 - Entrenando CatBoost...

🚀 Entrenando CatBoost (Validación Cruzada SIN SMOTE)...
0:	learn: 0.6044617	total: 44.2ms	remaining: 13.2s
50:	learn: 0.0201969	total: 2.33s	remaining: 11.4s
100:	learn: 0.0149730	total: 4.6s	remaining: 9.07s
150:	learn: 0.0131648	total: 9.1s	remaining: 8.97s
200:	learn: 0.0119928	total: 11.8s	remaining: 5.83s
250:	learn: 0.0110760	total: 14.2s	remaining: 2.76s
299:	learn: 0.0102029	total: 16.4s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada SIN SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 83.15
 - Recall_Train: 88.53
 - AUPRC_Train: 89.64
 - Accuracy_Train: 99.95
 - F1-Score_Train: 85.75
 - Precision_Test: 94.29
 - Recall_Test: 86.84
 - AUPRC_Test: 87.96
 - Accuracy_Test: 99.97
 - F1-Score_Test: 90.41
✅ Tamaño del DataFrame actualizado: (7, 135)

🔎 Fold 8/10 - Entrenando CatBoost...

🚀 Entrenando CatBoost (Validación Cruzada SIN SMOTE)...
0:	learn: 0.6048882	total: 47.7ms	remaining: 14.3s
50:	learn: 0.0199723	total: 2.56s	remaining: 12.5s
100:	learn: 0.0147744	total: 7.2s	remaining: 14.2s
150:	learn: 0.0131513	total: 9.51s	remaining: 9.39s
200:	learn: 0.0119062	total: 11.8s	remaining: 5.82s
250:	learn: 0.0109846	total: 14.1s	remaining: 2.76s
299:	learn: 0.0100997	total: 16.4s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada SIN SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 0
 - Precision_Train: 82.92
 - Recall_Train: 88.53
 - AUPRC_Train: 89.94
 - Accuracy_Train: 99.95
 - F1-Score_Train: 85.63
 - Precision_Test: 78.05
 - Recall_Test: 84.21
 - AUPRC_Test: 83.72
 - Accuracy_Test: 99.93
 - F1-Score_Test: 81.01
✅ Tamaño del DataFrame actualizado: (8, 135)

🔎 Fold 9/10 - Entrenando CatBoost...

🚀 Entrenando CatBoost (Validación Cruzada SIN SMOTE)...
0:	learn: 0.6056109	total: 100ms	remaining: 30s
50:	learn: 0.0207826	total: 4.72s	remaining: 23s
100:	learn: 0.0156416	total: 7.01s	remaining: 13.8s
150:	learn: 0.0137892	total: 9.28s	remaining: 9.15s
200:	learn: 0.0125089	total: 11.6s	remaining: 5.71s
250:	learn: 0.0114481	total: 13.9s	remaining: 2.71s
299:	learn: 0.0106068	total: 17.6s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada SIN SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 0
 - Precision_Train: 82.27
 - Recall_Train: 87.10
 - AUPRC_Train: 89.23
 - Accuracy_Train: 99.95
 - F1-Score_Train: 84.62
 - Precision_Test: 91.67
 - Recall_Test: 89.19
 - AUPRC_Test: 92.67
 - Accuracy_Test: 99.97
 - F1-Score_Test: 90.41
✅ Tamaño del DataFrame actualizado: (9, 135)

🔎 Fold 10/10 - Entrenando CatBoost...

🚀 Entrenando CatBoost (Validación Cruzada SIN SMOTE)...
0:	learn: 0.6044235	total: 97.5ms	remaining: 29.2s
50:	learn: 0.0196157	total: 2.44s	remaining: 11.9s
100:	learn: 0.0140260	total: 4.73s	remaining: 9.31s
150:	learn: 0.0123771	total: 7.01s	remaining: 6.92s
200:	learn: 0.0111960	total: 9.31s	remaining: 4.59s
250:	learn: 0.0102205	total: 13s	remaining: 2.54s
299:	learn: 0.0093952	total: 16.5s	remaining: 0us

✅ Resultados para CatBoost (Validación Cruzada SIN SMOTE):
 - Modelo: CatBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 83.66
 - Recall_Train: 88.56
 - AUPRC_Train: 91.11
 - Accuracy_Train: 99.95
 - F1-Score_Train: 86.04
 - Precision_Test: 73.17
 - Recall_Test: 81.08
 - AUPRC_Test: 76.25
 - Accuracy_Test: 99.92
 - F1-Score_Test: 76.92
✅ Tamaño del DataFrame actualizado: (10, 135)
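
The fold-by-fold run above iterates a stratified 10-fold split, so each test fold preserves the dataset's very small fraud fraction. The notebook itself presumably relies on `sklearn.model_selection.StratifiedKFold`; the helper below is only an illustrative pure-Python stand-in for that splitting logic:

```python
from collections import defaultdict

def stratified_kfold_indices(labels, n_splits=10):
    """Yield (train_idx, test_idx) pairs where each test fold keeps
    roughly the same class proportions as the full label list."""
    by_class = defaultdict(list)
    for i, y in enumerate(labels):
        by_class[y].append(i)
    # Deal each class's indices round-robin into the folds so every
    # fold receives a proportional share of each class.
    folds = [[] for _ in range(n_splits)]
    for idxs in by_class.values():
        for j, i in enumerate(idxs):
            folds[j % n_splits].append(i)
    for k in range(n_splits):
        test = sorted(folds[k])
        train = sorted(i for f in range(n_splits) if f != k for i in folds[f])
        yield train, test

# Tiny imbalanced example: 95 negatives, 5 positives, 5 folds.
labels = [0] * 95 + [1] * 5
splits = list(stratified_kfold_indices(labels, n_splits=5))
assert len(splits) == 5
for train, test in splits:
    assert sum(labels[i] for i in test) == 1  # exactly one positive per fold
```

With a plain (unstratified) split, several folds of such an imbalanced dataset could contain no fraud cases at all, making Recall and AUPRC undefined on those folds.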

🚀 Evaluación con XGBoost SIN SMOTE en cada fold...

🔎 Fold 1/10 - Entrenando XGBoost...

🚀 Entrenando XGBoost (Validación Cruzada SIN SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada SIN SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 90.38
 - Recall_Train: 96.76
 - AUPRC_Train: 98.48
 - Accuracy_Train: 99.98
 - F1-Score_Train: 93.47
 - Precision_Test: 86.49
 - Recall_Test: 84.21
 - AUPRC_Test: 87.94
 - Accuracy_Test: 99.95
 - F1-Score_Test: 85.33
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (11, 136)

🔎 Fold 2/10 - Entrenando XGBoost...

🚀 Entrenando XGBoost (Validación Cruzada SIN SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada SIN SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 91.67
 - Recall_Train: 97.06
 - AUPRC_Train: 98.73
 - Accuracy_Train: 99.98
 - F1-Score_Train: 94.29
 - Precision_Test: 90.62
 - Recall_Test: 76.32
 - AUPRC_Test: 81.86
 - Accuracy_Test: 99.95
 - F1-Score_Test: 82.86
✅ Tamaño del DataFrame actualizado: (12, 136)

🔎 Fold 3/10 - Entrenando XGBoost...

🚀 Entrenando XGBoost (Validación Cruzada SIN SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada SIN SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 88.53
 - Recall_Train: 97.65
 - AUPRC_Train: 98.13
 - Accuracy_Train: 99.97
 - F1-Score_Train: 92.87
 - Precision_Test: 91.43
 - Recall_Test: 84.21
 - AUPRC_Test: 83.62
 - Accuracy_Test: 99.96
 - F1-Score_Test: 87.67
✅ Tamaño del DataFrame actualizado: (13, 136)

🔎 Fold 4/10 - Entrenando XGBoost...

🚀 Entrenando XGBoost (Validación Cruzada SIN SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada SIN SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 91.11
 - Recall_Train: 96.47
 - AUPRC_Train: 98.31
 - Accuracy_Train: 99.98
 - F1-Score_Train: 93.71
 - Precision_Test: 87.10
 - Recall_Test: 71.05
 - AUPRC_Test: 83.01
 - Accuracy_Test: 99.93
 - F1-Score_Test: 78.26
✅ Tamaño del DataFrame actualizado: (14, 136)

🔎 Fold 5/10 - Entrenando XGBoost...

🚀 Entrenando XGBoost (Validación Cruzada SIN SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada SIN SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 91.69
 - Recall_Train: 97.35
 - AUPRC_Train: 98.74
 - Accuracy_Train: 99.98
 - F1-Score_Train: 94.44
 - Precision_Test: 75.61
 - Recall_Test: 81.58
 - AUPRC_Test: 79.57
 - Accuracy_Test: 99.92
 - F1-Score_Test: 78.48
✅ Tamaño del DataFrame actualizado: (15, 136)

🔎 Fold 6/10 - Entrenando XGBoost...

🚀 Entrenando XGBoost (Validación Cruzada SIN SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada SIN SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 0
 - Precision_Train: 91.36
 - Recall_Train: 96.47
 - AUPRC_Train: 98.40
 - Accuracy_Train: 99.98
 - F1-Score_Train: 93.85
 - Precision_Test: 84.62
 - Recall_Test: 86.84
 - AUPRC_Test: 89.80
 - Accuracy_Test: 99.95
 - F1-Score_Test: 85.71
✅ Tamaño del DataFrame actualizado: (16, 136)

🔎 Fold 7/10 - Entrenando XGBoost...

🚀 Entrenando XGBoost (Validación Cruzada SIN SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada SIN SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 90.33
 - Recall_Train: 96.18
 - AUPRC_Train: 98.26
 - Accuracy_Train: 99.98
 - F1-Score_Train: 93.16
 - Precision_Test: 96.97
 - Recall_Test: 84.21
 - AUPRC_Test: 87.49
 - Accuracy_Test: 99.97
 - F1-Score_Test: 90.14
✅ Tamaño del DataFrame actualizado: (17, 136)

🔎 Fold 8/10 - Entrenando XGBoost...

🚀 Entrenando XGBoost (Validación Cruzada SIN SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada SIN SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 91.41
 - Recall_Train: 97.06
 - AUPRC_Train: 98.35
 - Accuracy_Train: 99.98
 - F1-Score_Train: 94.15
 - Precision_Test: 83.78
 - Recall_Test: 81.58
 - AUPRC_Test: 85.79
 - Accuracy_Test: 99.94
 - F1-Score_Test: 82.67
✅ Tamaño del DataFrame actualizado: (18, 136)

🔎 Fold 9/10 - Entrenando XGBoost...

🚀 Entrenando XGBoost (Validación Cruzada SIN SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada SIN SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 0
 - Precision_Train: 89.84
 - Recall_Train: 95.89
 - AUPRC_Train: 98.10
 - Accuracy_Train: 99.97
 - F1-Score_Train: 92.77
 - Precision_Test: 94.29
 - Recall_Test: 89.19
 - AUPRC_Test: 95.20
 - Accuracy_Test: 99.97
 - F1-Score_Test: 91.67
✅ Tamaño del DataFrame actualizado: (19, 136)

🔎 Fold 10/10 - Entrenando XGBoost...

🚀 Entrenando XGBoost (Validación Cruzada SIN SMOTE)...

✅ Resultados para XGBoost (Validación Cruzada SIN SMOTE):
 - Modelo: XGBoost
 - Tecnica: Validación Cruzada SIN SMOTE
 - Sobreajuste: 1
 - Precision_Train: 92.18
 - Recall_Train: 96.77
 - AUPRC_Train: 98.74
 - Accuracy_Train: 99.98
 - F1-Score_Train: 94.42
 - Precision_Test: 81.08
 - Recall_Test: 81.08
 - AUPRC_Test: 75.75
 - Accuracy_Test: 99.94
 - F1-Score_Test: 81.08
✅ Tamaño del DataFrame actualizado: (20, 136)
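
Each fold record above reports Precision, Recall and F1 on train and test, plus a binary `Sobreajuste` (overfitting) flag. The sketch below shows how such metrics follow from confusion-matrix counts; the train/test gap threshold used for the flag is an illustrative assumption, not the notebook's actual rule:

```python
def classification_metrics(tp, fp, fn):
    """Precision, recall and F1 (as percentages) from confusion-matrix counts."""
    precision = 100.0 * tp / (tp + fp) if tp + fp else 0.0
    recall = 100.0 * tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

def overfit_flag(f1_train, f1_test, max_gap=5.0):
    """1 if train F1 exceeds test F1 by more than max_gap points, else 0.
    max_gap=5.0 is an assumed threshold for illustration only."""
    return int(f1_train - f1_test > max_gap)

p, r, f1 = classification_metrics(tp=30, fp=10, fn=10)
assert (p, r, f1) == (75.0, 75.0, 75.0)
assert overfit_flag(86.94, 72.94) == 1  # large train/test gap -> flagged
assert overfit_flag(85.39, 81.08) == 0  # small gap -> not flagged
```

Note that Accuracy stays near 99.9% in every fold regardless of quality — with roughly 0.17% fraud, a model that predicts "no fraud" for everything is almost 99.8% accurate — which is why Recall and AUPRC, not Accuracy, drive the model comparison here.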

🏆 Resultados Finales Ordenados (Top Modelos):
(Técnica común a las 20 filas: Validación Cruzada SIN SMOTE; métricas en %, redondeadas a 2 decimales; orden descendente por AUPRC_Test)

    Modelo    Sobreajuste  Precision_Train  Recall_Train  AUPRC_Train  Accuracy_Train  F1-Score_Train  Precision_Test  Recall_Test  AUPRC_Test  Accuracy_Test  F1-Score_Test  Fold
18  XGBoost             0            89.84         95.89        98.10           99.97           92.77           94.29        89.19       95.20          99.97          91.67     9
8   CatBoost            0            82.27         87.10        89.23           99.95           84.62           91.67        89.19       92.67          99.97          90.41     9
15  XGBoost             0            91.36         96.47        98.40           99.98           93.85           84.62        86.84       89.80          99.95          85.71     6
6   CatBoost            1            83.15         88.53        89.64           99.95           85.75           94.29        86.84       87.96          99.97          90.41     7
10  XGBoost             1            90.38         96.76        98.48           99.98           93.47           86.49        84.21       87.94          99.95          85.33     1
5   CatBoost            0            81.92         87.94        89.82           99.95           84.82           80.49        86.84       87.71          99.94          83.54     6
16  XGBoost             1            90.33         96.18        98.26           99.98           93.16           96.97        84.21       87.49          99.97          90.14     7
0   CatBoost            0            82.02         88.53        90.30           99.95           85.15           79.49        81.58       86.04          99.93          80.52     1
17  XGBoost             1            91.41         97.06        98.35           99.98           94.15           83.78        81.58       85.79          99.94          82.67     8
7   CatBoost            0            82.92         88.53        89.94           99.95           85.63           78.05        84.21       83.72          99.93          81.01     8
12  XGBoost             1            88.53         97.65        98.13           99.97           92.87           91.43        84.21       83.62          99.96          87.67     3
13  XGBoost             1            91.11         96.47        98.31           99.98           93.71           87.10        71.05       83.01          99.93          78.26     4
2   CatBoost            0            82.34         89.12        90.97           99.95           85.59           86.49        84.21       82.43          99.95          85.33     3
3   CatBoost            0            82.47         88.53        90.10           99.95           85.39           83.33        78.95       82.11          99.94          81.08     4
11  XGBoost             1            91.67         97.06        98.73           99.98           94.29           90.63        76.32       81.86          99.95          82.86     2
1   CatBoost            0            84.31         88.53        90.45           99.95           86.37           83.33        78.95       81.39          99.94          81.08     2
14  XGBoost             1            91.69         97.35        98.74           99.98           94.44           75.61        81.58       79.57          99.92          78.48     5
4   CatBoost            1            84.87         89.12        90.34           99.96           86.94           65.96        81.58       77.20          99.90          72.94     5
9   CatBoost            1            83.66         88.56        91.11           99.95           86.04           73.17        81.08       76.25          99.92          76.92    10
19  XGBoost             1            92.18         96.77        98.74           99.98           94.42           81.08        81.08       75.75          99.94          81.08    10

Hiperparámetros registrados en el volcado: las filas de CatBoost usaron iterations=300, learning_rate=0.03, depth=4, class_weights=[1, 10], l2_leaf_reg=10 y verbose=50; las filas de XGBoost no registraron hiperparámetros explícitos (valores por defecto, columnas en NaN). Las restantes ~110 columnas del volcado original (parámetros internos de CatBoost/XGBoost como od_type, bootstrap_type, task_type, etc.) contenían únicamente None/NaN y se omiten por legibilidad.
1                None            None              None    None   
14               None            None              None    None   
4                None            None              None    None   
9                None            None              None    None   
19                NaN             NaN               NaN     NaN   

   monotone_constraints dictionaries max_bin boost_from_average grow_policy  \
18                 None         None    None               None        None   
8                  None         None    None               None        None   
15                 None         None    None               None        None   
6                  None         None    None               None        None   
10                 None         None    None               None        None   
5                  None         None    None               None        None   
16                 None         None    None               None        None   
0                  None         None    None               None        None   
17                 None         None    None               None        None   
7                  None         None    None               None        None   
12                 None         None    None               None        None   
13                 None         None    None               None        None   
2                  None         None    None               None        None   
3                  None         None    None               None        None   
11                 None         None    None               None        None   
1                  None         None    None               None        None   
14                 None         None    None               None        None   
4                  None         None    None               None        None   
9                  None         None    None               None        None   
19                  NaN          NaN     NaN                NaN         NaN   

   embedding_features langevin callback cat_features train_dir  \
18               None     None     None         None      None   
8                None     None     None         None      None   
15               None     None     None         None      None   
6                None     None     None         None      None   
10               None     None     None         None      None   
5                None     None     None         None      None   
16               None     None     None         None      None   
0                None     None     None         None      None   
17               None     None     None         None      None   
7                None     None     None         None      None   
12               None     None     None         None      None   
13               None     None     None         None      None   
2                None     None     None         None      None   
3                None     None     None         None      None   
11               None     None     None         None      None   
1                None     None     None         None      None   
14               None     None     None         None      None   
4                None     None     None         None      None   
9                None     None     None         None      None   
19                NaN      NaN      NaN          NaN       NaN   

   sparse_features_conflict_fraction ignored_features num_trees  \
18                              None             None      None   
8                               None             None      None   
15                              None             None      None   
6                               None             None      None   
10                              None             None      None   
5                               None             None      None   
16                              None             None      None   
0                               None             None      None   
17                              None             None      None   
7                               None             None      None   
12                              None             None      None   
13                              None             None      None   
2                               None             None      None   
3                               None             None      None   
11                              None             None      None   
1                               None             None      None   
14                              None             None      None   
4                               None             None      None   
9                               None             None      None   
19                               NaN              NaN       NaN   

   penalties_coefficient objective used_ram_limit text_processing reg_lambda  \
18                  None      None           None            None       None   
8                   None      None           None            None       None   
15                  None      None           None            None       None   
6                   None      None           None            None       None   
10                  None      None           None            None       None   
5                   None      None           None            None       None   
16                  None      None           None            None       None   
0                   None      None           None            None       None   
17                  None      None           None            None       None   
7                   None      None           None            None       None   
12                  None      None           None            None       None   
13                  None      None           None            None       None   
2                   None      None           None            None       None   
3                   None      None           None            None       None   
11                  None      None           None            None       None   
1                   None      None           None            None       None   
14                  None      None           None            None       None   
4                   None      None           None            None       None   
9                   None      None           None            None       None   
19                   NaN      None            NaN             NaN        NaN   

   snapshot_file random_state custom_loss loss_function  \
18          None         None        None          None   
8           None         None        None          None   
15          None         None        None          None   
6           None         None        None          None   
10          None         None        None          None   
5           None         None        None          None   
16          None         None        None          None   
0           None         None        None          None   
17          None         None        None          None   
7           None         None        None          None   
12          None         None        None          None   
13          None         None        None          None   
2           None         None        None          None   
3           None         None        None          None   
11          None         None        None          None   
1           None         None        None          None   
14          None         None        None          None   
4           None         None        None          None   
9           None         None        None          None   
19           NaN          NaN         NaN           NaN   

   leaf_estimation_iterations silent max_leaves input_borders  \
18                       None   None       None          None   
8                        None   None       None          None   
15                       None   None       None          None   
6                        None   None       None          None   
10                       None   None       None          None   
5                        None   None       None          None   
16                       None   None       None          None   
0                        None   None       None          None   
17                       None   None       None          None   
7                        None   None       None          None   
12                       None   None       None          None   
13                       None   None       None          None   
2                        None   None       None          None   
3                        None   None       None          None   
11                       None   None       None          None   
1                        None   None       None          None   
14                       None   None       None          None   
4                        None   None       None          None   
9                        None   None       None          None   
19                        NaN    NaN        NaN           NaN   

   counter_calc_method num_boost_round model_size_reg eval_metric num_leaves  \
18                None            None           None        None       None   
8                 None            None           None        None       None   
15                None            None           None        None       None   
6                 None            None           None        None       None   
10                None            None           None        None       None   
5                 None            None           None        None       None   
16                None            None           None        None       None   
0                 None            None           None        None       None   
17                None            None           None        None       None   
7                 None            None           None        None       None   
12                None            None           None        None       None   
13                None            None           None        None       None   
2                 None            None           None        None       None   
3                 None            None           None        None       None   
11                None            None           None        None       None   
1                 None            None           None        None       None   
14                None            None           None        None       None   
4                 None            None           None        None       None   
9                 None            None           None        None       None   
19                 NaN             NaN            NaN         NaN        NaN   

   min_child_samples class_names logging_level first_feature_use_penalties  \
18              None        None          None                        None   
8               None        None          None                        None   
15              None        None          None                        None   
6               None        None          None                        None   
10              None        None          None                        None   
5               None        None          None                        None   
16              None        None          None                        None   
0               None        None          None                        None   
17              None        None          None                        None   
7               None        None          None                        None   
12              None        None          None                        None   
13              None        None          None                        None   
2               None        None          None                        None   
3               None        None          None                        None   
11              None        None          None                        None   
1               None        None          None                        None   
14              None        None          None                        None   
4               None        None          None                        None   
9               None        None          None                        None   
19               NaN         NaN           NaN                         NaN   

   od_wait kwargs  
18    None    NaN  
8     None    NaN  
15    None    NaN  
6     None    NaN  
10    None    NaN  
5     None    NaN  
16    None    NaN  
0     None    NaN  
17    None    NaN  
7     None    NaN  
12    None    NaN  
13    None    NaN  
2     None    NaN  
3     None    NaN  
11    None    NaN  
1     None    NaN  
14    None    NaN  
4     None    NaN  
9     None    NaN  
19     NaN   None  

✅ Resultados guardados en 'resultados_validacion_cruzada_sin_smote.csv'
CPU times: user 7min 56s, sys: 8.84 s, total: 8min 5s
Wall time: 5min 5s

4. Hyperparameter Optimization with GridSearchCV (with SMOTE)¶

GridSearchCV exhaustively evaluates every combination of the specified hyperparameters to find the best configuration. Placing SMOTE inside the pipeline ensures that oversampling is applied only to the training folds of each cross-validation split, so no synthetic samples leak into the validation folds.
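The `classifier__<parameter>` naming used in the grids below follows scikit-learn's convention for addressing a step inside a pipeline. A minimal, self-contained sketch of this convention on synthetic data (not the fraud dataset; StandardScaler and LogisticRegression stand in here for the SMOTE + boosting pipeline used in the actual cell):

```python
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, StratifiedKFold

# Imbalanced toy data (~10% positives), purely illustrative
X_toy, y_toy = make_classification(
    n_samples=300, weights=[0.9, 0.1], random_state=42
)

pipe = Pipeline([
    ("scaler", StandardScaler()),
    ("classifier", LogisticRegression(max_iter=1000)),
])

grid = GridSearchCV(
    pipe,
    # Grid keys use "<step_name>__<param>" to reach the classifier step
    param_grid={"classifier__C": [0.1, 1.0]},
    scoring="average_precision",  # AUPRC, the metric used throughout this project
    cv=StratifiedKFold(n_splits=3, shuffle=True, random_state=42),
)
grid.fit(X_toy, y_toy)
print(grid.best_params_)  # keys follow the "<step>__<param>" pattern
```

The same mechanism is why, after the search, the prefixes must be stripped (`k.split("__")[-1]`) before passing the winning parameters directly to the bare classifier.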

In [ ]:
%%time
# ==========================================================
# GridSearchCV con SMOTE en Pipeline
# NOTE: %%time must be the first line of the cell to take effect.
# ==========================================================

from sklearn.model_selection import train_test_split, StratifiedKFold, GridSearchCV
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline as ImbPipeline
from xgboost import XGBClassifier
from catboost import CatBoostClassifier
import pandas as pd

# ================================================
# 1. División del Dataset
# ================================================
print("\n🔄 Dividiendo el conjunto de datos en entrenamiento y prueba...")
X_train_full, X_test, y_train_full, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)
print(f"✅ Tamaño de entrenamiento: {len(X_train_full)} | Tamaño de prueba: {len(X_test)}")

# ================================================
# 2. Configuración de Parámetros
# ================================================
param_grid_xgb = {
    'classifier__learning_rate': [0.01, 0.05],
    'classifier__max_depth': [3, 4],
    'classifier__n_estimators': [100, 200],
    'classifier__scale_pos_weight': [5, 10]
}

param_grid_catboost = {
    'classifier__iterations': [200, 300],
    'classifier__learning_rate': [0.01, 0.05],
    'classifier__depth': [4, 6],
    'classifier__class_weights': [[1, 50]]
}

# ================================================
# 3. Inicialización del DataFrame de Resultados
# ================================================
columnas_resultados = [
    'Modelo', 'Tecnica', 'Sobreajuste',
    'Precision_Train', 'Recall_Train', 'AUPRC_Train', 'Accuracy_Train', 'F1-Score_Train',
    'Precision_Test', 'Recall_Test', 'AUPRC_Test', 'Accuracy_Test', 'F1-Score_Test',
    'learning_rate', 'max_depth', 'n_estimators', 'scale_pos_weight',
    'iterations', 'depth', 'class_weights', 'Fold'
]
resultados_gridsearch_con_smote = pd.DataFrame(columns=columnas_resultados)

# ================================================
# 4. Función GridSearchCV con SMOTE en Pipeline
# ================================================
def gridsearch_cv_con_smote_pipeline(modelo, param_grid, nombre_modelo, resultados_df):
    print(f"\n🚀 Iniciando GridSearchCV con SMOTE para {nombre_modelo}...")

    # Configuración del pipeline con SMOTE
    pipeline = ImbPipeline([
        ('smote', SMOTE(random_state=42)),
        ('classifier', modelo())
    ])

    # Configuración de GridSearchCV con validación cruzada
    grid_search = GridSearchCV(
        estimator=pipeline,
        param_grid=param_grid,
        scoring='average_precision',
        cv=StratifiedKFold(n_splits=3, shuffle=True, random_state=42),
        verbose=2,
        n_jobs=-1
    )

    # Ejecución del GridSearchCV
    grid_search.fit(X_train_full, y_train_full)

    # Obtener los mejores parámetros encontrados
    mejores_parametros = grid_search.best_params_
    print(f"\n✅ Mejores parámetros para {nombre_modelo}: {mejores_parametros}")

    # Evaluación del modelo final con los mejores parámetros
    print("\n📈 Evaluando el modelo final con los mejores hiperparámetros...")
    mejores_parametros = {k.split("__")[-1]: v for k, v in mejores_parametros.items()}  # Limpiar nombres de parámetros
    resultados_df = entrenar_y_evaluar(
        modelo=modelo,
        nombre_modelo=nombre_modelo,
        parametros=mejores_parametros,
        X_train=X_train_full,
        y_train=y_train_full,
        X_test=X_test,
        y_test=y_test,
        tecnica="GridSearchCV con SMOTE",
        resultados_df=resultados_df
    )
    return resultados_df

# ================================================
# 5. Optimización y Evaluación
# ================================================
# Ejecutar GridSearchCV para XGBoost
print("\n🚀 GridSearchCV para XGBoost...")
resultados_gridsearch_con_smote = gridsearch_cv_con_smote_pipeline(
    XGBClassifier, param_grid_xgb, "XGBoost", resultados_gridsearch_con_smote
)

# Ejecutar GridSearchCV para CatBoost
print("\n🚀 GridSearchCV para CatBoost...")
resultados_gridsearch_con_smote = gridsearch_cv_con_smote_pipeline(
    CatBoostClassifier, param_grid_catboost, "CatBoost", resultados_gridsearch_con_smote
)

# ================================================
# 6. Consolidar y Guardar Resultados
# ================================================
print("\n📊 Consolidando y ordenando resultados...")
resultados_ordenados = resultados_gridsearch_con_smote.sort_values(
    by=['AUPRC_Test', 'Recall_Test', 'Precision_Test', 'F1-Score_Test'],
    ascending=[False, False, False, False]
)

print("\n🏆 Resultados Finales Ordenados:")
print(resultados_ordenados)

# Guardar resultados en CSV
output_file = "resultados_gridsearch_con_smote.csv"
resultados_ordenados.to_csv(output_file, index=False)
print(f"\n✅ Resultados guardados en '{output_file}'")
🔄 Dividiendo el conjunto de datos en entrenamiento y prueba...
✅ Tamaño de entrenamiento: 179774 | Tamaño de prueba: 44944

🚀 GridSearchCV para XGBoost...

🚀 Iniciando GridSearchCV con SMOTE para XGBoost...
Fitting 3 folds for each of 16 candidates, totalling 48 fits

✅ Mejores parámetros para XGBoost: {'classifier__learning_rate': 0.01, 'classifier__max_depth': 3, 'classifier__n_estimators': 100, 'classifier__scale_pos_weight': 5}

📈 Evaluando el modelo final con los mejores hiperparámetros...

🚀 Entrenando XGBoost (GridSearchCV con SMOTE)...

✅ Resultados para XGBoost (GridSearchCV con SMOTE):
 - Modelo: XGBoost
 - Tecnica: GridSearchCV con SMOTE
 - Sobreajuste: 0
 - Precision_Train: 85.71
 - Recall_Train: 83.44
 - AUPRC_Train: 82.03
 - Accuracy_Train: 99.95
 - F1-Score_Train: 84.56
 - Precision_Test: 86.15
 - Recall_Test: 73.68
 - AUPRC_Test: 76.72
 - Accuracy_Test: 99.94
 - F1-Score_Test: 79.43
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 23)

🚀 GridSearchCV para CatBoost...

🚀 Iniciando GridSearchCV con SMOTE para CatBoost...
Fitting 3 folds for each of 8 candidates, totalling 24 fits
[CatBoost training log abbreviated: 200 iterations, learn loss decreasing from 0.5621 to 0.0053, ~23 s total.]

✅ Mejores parámetros para CatBoost: {'classifier__class_weights': [1, 50], 'classifier__depth': 4, 'classifier__iterations': 200, 'classifier__learning_rate': 0.05}

📈 Evaluando el modelo final con los mejores hiperparámetros...

🚀 Entrenando CatBoost (GridSearchCV con SMOTE)...
0:	learn: 0.5962440	total: 67.5ms	remaining: 13.4s
1:	learn: 0.5060186	total: 155ms	remaining: 15.3s
2:	learn: 0.4281567	total: 240ms	remaining: 15.8s
3:	learn: 0.3700498	total: 330ms	remaining: 16.2s
... (iteraciones 4-197 omitidas por brevedad: learn desciende de forma monótona de 0.3151599 a 0.0167926)
198:	learn: 0.0167337	total: 8.96s	remaining: 45ms
199:	learn: 0.0166163	total: 9s	remaining: 0us

✅ Resultados para CatBoost (GridSearchCV con SMOTE):
 - Modelo: CatBoost
 - Tecnica: GridSearchCV con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 72.07
 - Recall_Train: 95.70
 - AUPRC_Train: 89.09
 - Accuracy_Train: 99.93
 - F1-Score_Train: 82.22
 - Precision_Test: 68.13
 - Recall_Test: 81.58
 - AUPRC_Test: 75.57
 - Accuracy_Test: 99.90
 - F1-Score_Test: 74.25
 - iterations: 200
 - depth: 4
 - learning_rate: 0.05
 - class_weights: [1, 50]
 - (los restantes parámetros de CatBoost son None y se omiten por brevedad)
✅ Tamaño del DataFrame actualizado: (2, 134)

📊 Consolidando y ordenando resultados...

🏆 Resultados Finales Ordenados:
     Modelo                 Tecnica Sobreajuste  Precision_Train  Recall_Train  AUPRC_Train  Accuracy_Train  F1-Score_Train  Precision_Test  Recall_Test  AUPRC_Test  Accuracy_Test  F1-Score_Test
0   XGBoost  GridSearchCV con SMOTE           0            85.71         83.44        82.03           99.95           84.56           86.15        73.68       76.72          99.94          79.43
1  CatBoost  GridSearchCV con SMOTE           1            72.07         95.70        89.09           99.93           82.22           68.13        81.58       75.57          99.90          74.25

Hiperparámetros ajustados (CatBoost): iterations=200, depth=4, learning_rate=0.05, class_weights=[1, 50].
(Las restantes columnas de hiperparámetros del DataFrame son NaN/None y se omiten por brevedad.)

✅ Resultados guardados en 'resultados_gridsearch_con_smote.csv'
CPU times: user 1min 16s, sys: 3.65 s, total: 1min 19s
Wall time: 13min 57s

5. Optimización de Hiperparámetros con Optuna (con SMOTE)¶

Optimización de Hiperparámetros con Optuna:

  • Se aplica SMOTE en cada iteración de validación cruzada para equilibrar los datos.
  • La función optimizar_optuna ajusta los hiperparámetros de los modelos buscando maximizar la métrica AUPRC.
  • Se registran las métricas clave y los mejores parámetros de cada modelo; los resultados finales se ordenan, se filtran y se guardan en un archivo CSV.

Optuna suele ser más eficiente que GridSearchCV porque utiliza búsqueda bayesiana (el muestreador TPE) para explorar el espacio de hiperparámetros de manera inteligente, concentrando los ensayos en las regiones más prometedoras en lugar de evaluar exhaustivamente todas las combinaciones.

In [ ]:
# ==========================================================
# Optimización de Hiperparámetros con Optuna y SMOTE
# ==========================================================
import optuna
import numpy as np
from sklearn.model_selection import StratifiedKFold
from imblearn.over_sampling import SMOTE
from xgboost import XGBClassifier
from catboost import CatBoostClassifier
import pandas as pd

# ==============================
# 1. Inicialización del DataFrame de Resultados
# ==============================
columnas_resultados_optuna = [
    'Modelo', 'Tecnica', 'Fold', 'Sobreajuste',
    'Precision_Train', 'Recall_Train', 'AUPRC_Train', 'Accuracy_Train', 'F1-Score_Train',
    'Precision_Test', 'Recall_Test', 'AUPRC_Test', 'Accuracy_Test', 'F1-Score_Test',
    'learning_rate', 'max_depth', 'n_estimators', 'scale_pos_weight', 'iterations', 'class_weights'
]
resultados_optuna = pd.DataFrame(columns=columnas_resultados_optuna)

# ==============================
# 2. Función optimizar_optuna adaptada
# ==============================
def optimizar_optuna(trial, modelo, nombre_modelo, X_train, y_train, resultados_df):
    """
    Optimiza hiperparámetros usando Optuna y valida con SMOTE y StratifiedKFold.
    Guarda todas las métricas y parámetros en el DataFrame global resultados_optuna.
    """
    global resultados_optuna  # persistir las métricas acumuladas entre trials

    print(f"\n🔍 Optimizando hiperparámetros para {nombre_modelo} con Optuna...")

    # Hiperparámetros sugeridos por Optuna
    # (suggest_loguniform está obsoleto desde Optuna 3.0; se usa suggest_float con log=True)
    parametros = {
        'learning_rate': trial.suggest_float('learning_rate', 0.01, 0.1, log=True),
        'max_depth': trial.suggest_int('max_depth', 3, 6),
        'n_estimators': trial.suggest_int('n_estimators', 100, 300),
        'scale_pos_weight': trial.suggest_float('scale_pos_weight', 5, 15),
    }

    kfold = StratifiedKFold(n_splits=3, shuffle=True, random_state=42)
    scores = []

    # Evaluación en cada fold
    for fold, (train_idx, test_idx) in enumerate(kfold.split(X_train, y_train)):
        print(f"\n🔄 Fold {fold + 1}: Optimización en progreso...")

        # Dividir datos
        X_train_fold, X_test_fold = X_train.iloc[train_idx], X_train.iloc[test_idx]
        y_train_fold, y_test_fold = y_train.iloc[train_idx], y_train.iloc[test_idx]

        # Aplicar SMOTE solo sobre el fold de entrenamiento (evita fuga de datos)
        print(f"📊 Antes de SMOTE: {y_train_fold.value_counts(normalize=True)}")
        smote = SMOTE(random_state=42)
        X_train_res, y_train_res = smote.fit_resample(X_train_fold, y_train_fold)
        print(f"📈 Después de SMOTE: {y_train_res.value_counts(normalize=True)}")

        # Entrenar y evaluar el modelo
        resultados_df = entrenar_y_evaluar(
            modelo=modelo,
            nombre_modelo=nombre_modelo,
            parametros=parametros,
            X_train=X_train_res,
            y_train=y_train_res,
            X_test=X_test_fold,
            y_test=y_test_fold,
            tecnica="Optuna con SMOTE",
            resultados_df=resultados_df
        )

        # Guardar la métrica AUPRC
        score = resultados_df['AUPRC_Test'].iloc[-1]
        scores.append(score)

    # Actualizar el DataFrame global: sin esta asignación, las métricas de cada
    # trial se perderían al devolver solo el promedio de AUPRC
    resultados_optuna = resultados_df

    # Promedio de AUPRC en los folds
    mean_score = np.mean(scores)
    print(f"\n🏆 Promedio de AUPRC en validación cruzada: {mean_score:.4f}")

    return mean_score

# ==============================
# 3. Optimización con Optuna
# ==============================

# ==============================
# Optuna para XGBoost
# ==============================
print("\n🚀 Optimización de Hiperparámetros con Optuna para XGBoost...")
study_xgb = optuna.create_study(direction='maximize')
study_xgb.optimize(
    lambda trial: optimizar_optuna(trial, XGBClassifier, "XGBoost", X_train, y_train, resultados_optuna),
    n_trials=50
)

# ==============================
# Optuna para CatBoost
# ==============================
print("\n🚀 Optimización de Hiperparámetros con Optuna para CatBoost...")
study_catboost = optuna.create_study(direction='maximize')
study_catboost.optimize(
    lambda trial: optimizar_optuna(trial, CatBoostClassifier, "CatBoost", X_train, y_train, resultados_optuna),
    n_trials=50
)

# ==============================
# 4. Consolidar y Guardar Resultados
# ==============================
print("\n🔍 Consolidando y ordenando resultados...")
resultados_ordenados = resultados_optuna.sort_values(
    by=['AUPRC_Test', 'Recall_Test', 'Precision_Test', 'F1-Score_Test'],
    ascending=[False, False, False, False]
)

print("\n🏆 Resultados Finales Ordenados:")
print(resultados_ordenados)

# Guardar resultados
output_file = "resultados_optuna_con_smote.csv"
resultados_ordenados.to_csv(output_file, index=False)
print(f"\n✅ Resultados guardados en '{output_file}'")
[I 2024-12-19 13:22:34,041] A new study created in memory with name: no-name-8e0e9dbf-6367-41a2-aecd-248cd5ebcaaf
🚀 Optimización de Hiperparámetros con Optuna para XGBoost...

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.65
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 98.79
 - F1-Score_Train: 98.81
 - Precision_Test: 5.66
 - Recall_Test: 89.68
 - AUPRC_Test: 76.96
 - Accuracy_Test: 97.47
 - F1-Score_Test: 10.65
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 96.64
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 98.26
 - F1-Score_Train: 98.29
 - Precision_Test: 4.38
 - Recall_Test: 94.44
 - AUPRC_Test: 76.63
 - Accuracy_Test: 96.52
 - F1-Score_Test: 8.38
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:23:02,583] Trial 0 finished with value: 76.27242310152417 and parameters: {'learning_rate': 0.06946814045829992, 'max_depth': 4, 'n_estimators': 111, 'scale_pos_weight': 10.849281219534213}. Best is trial 0 with value: 76.27242310152417.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.32
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 98.62
 - F1-Score_Train: 98.64
 - Precision_Test: 5.13
 - Recall_Test: 88.89
 - AUPRC_Test: 75.22
 - Accuracy_Test: 97.22
 - F1-Score_Test: 9.70
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 76.2724

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.58
 - Recall_Train: 99.71
 - AUPRC_Train: 99.81
 - Accuracy_Train: 97.00
 - F1-Score_Train: 97.08
 - Precision_Test: 2.55
 - Recall_Test: 90.48
 - AUPRC_Test: 75.17
 - Accuracy_Test: 94.17
 - F1-Score_Test: 4.96
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 93.55
 - Recall_Train: 98.85
 - AUPRC_Train: 99.68
 - Accuracy_Train: 96.02
 - F1-Score_Train: 96.13
 - Precision_Test: 2.31
 - Recall_Test: 95.24
 - AUPRC_Test: 68.69
 - Accuracy_Test: 93.22
 - F1-Score_Test: 4.51
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:23:39,083] Trial 1 finished with value: 73.28403168120853 and parameters: {'learning_rate': 0.01877718389352851, 'max_depth': 3, 'n_estimators': 215, 'scale_pos_weight': 5.1617134561936755}. Best is trial 0 with value: 76.27242310152417.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.55
 - Recall_Train: 99.07
 - AUPRC_Train: 99.73
 - Accuracy_Train: 96.68
 - F1-Score_Train: 96.76
 - Precision_Test: 2.61
 - Recall_Test: 91.27
 - AUPRC_Test: 75.99
 - Accuracy_Test: 94.25
 - F1-Score_Test: 5.07
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 73.2840

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.44
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.21
 - F1-Score_Train: 99.21
 - Precision_Test: 8.47
 - Recall_Test: 88.89
 - AUPRC_Test: 79.00
 - Accuracy_Test: 98.36
 - F1-Score_Test: 15.46
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.60
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 98.77
 - F1-Score_Train: 98.79
 - Precision_Test: 6.04
 - Recall_Test: 93.65
 - AUPRC_Test: 77.51
 - Accuracy_Test: 97.54
 - F1-Score_Test: 11.35
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:24:13,357] Trial 2 finished with value: 79.11833773313246 and parameters: {'learning_rate': 0.09268097642588863, 'max_depth': 3, 'n_estimators': 181, 'scale_pos_weight': 12.430971158144311}. Best is trial 2 with value: 79.11833773313246.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.15
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.06
 - F1-Score_Train: 99.07
 - Precision_Test: 7.36
 - Recall_Test: 88.89
 - AUPRC_Test: 80.85
 - Accuracy_Test: 98.10
 - F1-Score_Test: 13.59
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 79.1183

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.50
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.75
 - F1-Score_Train: 99.75
 - Precision_Test: 19.12
 - Recall_Test: 86.51
 - AUPRC_Test: 81.69
 - Accuracy_Test: 99.36
 - F1-Score_Test: 31.32
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.01
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.50
 - F1-Score_Train: 99.50
 - Precision_Test: 12.39
 - Recall_Test: 89.68
 - AUPRC_Test: 79.80
 - Accuracy_Test: 98.92
 - F1-Score_Test: 21.77
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:25:14,604] Trial 3 finished with value: 81.49583968883515 and parameters: {'learning_rate': 0.05439196518814538, 'max_depth': 4, 'n_estimators': 279, 'scale_pos_weight': 10.400533011658881}. Best is trial 3 with value: 81.49583968883515.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.31
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.65
 - F1-Score_Train: 99.65
 - Precision_Test: 16.52
 - Recall_Test: 87.30
 - AUPRC_Test: 82.99
 - Accuracy_Test: 99.24
 - F1-Score_Test: 27.78
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 81.4958

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.65
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.82
 - F1-Score_Train: 99.83
 - Precision_Test: 22.55
 - Recall_Test: 85.71
 - AUPRC_Test: 82.84
 - Accuracy_Test: 99.48
 - F1-Score_Test: 35.70
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.51
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.75
 - F1-Score_Train: 99.75
 - Precision_Test: 20.64
 - Recall_Test: 86.51
 - AUPRC_Test: 79.96
 - Accuracy_Test: 99.42
 - F1-Score_Test: 33.33
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:26:06,517] Trial 4 finished with value: 82.01437290523499 and parameters: {'learning_rate': 0.04601648510038596, 'max_depth': 6, 'n_estimators': 185, 'scale_pos_weight': 7.5841982122007}. Best is trial 4 with value: 82.01437290523499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.67
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.83
 - F1-Score_Train: 99.83
 - Precision_Test: 26.33
 - Recall_Test: 86.51
 - AUPRC_Test: 83.24
 - Accuracy_Test: 99.57
 - F1-Score_Test: 40.37
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 82.0144
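The "✅ Tamaño del DataFrame actualizado: (n, 22)" lines suggest each fold's metrics are appended as one row of a 22-column results table. A hypothetical sketch of that accumulation (the column names beyond the three visible in the log are placeholders):

```python
# Hypothetical sketch of the running results table behind the
# "✅ Tamaño del DataFrame actualizado: (n, 22)" messages: one row per fold,
# 22 metric/metadata columns. Only Modelo/Tecnica/Sobreajuste are taken from
# the log; the remaining column names are invented placeholders.
import pandas as pd

columns = ["Modelo", "Tecnica", "Sobreajuste"] + [f"metric_{i}" for i in range(19)]
rows = []
for fold in range(1, 4):
    row = dict.fromkeys(columns)
    row.update(Modelo="XGBoost", Tecnica="Optuna con SMOTE", Sobreajuste=1)
    rows.append(row)
    results = pd.DataFrame(rows, columns=columns)
    print(f"✅ Tamaño del DataFrame actualizado: {results.shape}")
```

Building the frame from a list of dicts (instead of repeated `pd.concat` on a growing DataFrame) avoids quadratic copying and pandas' empty-frame concat warnings.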

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.64
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.82
 - F1-Score_Train: 99.82
 - Precision_Test: 23.34
 - Recall_Test: 86.51
 - AUPRC_Test: 81.07
 - Accuracy_Test: 99.50
 - F1-Score_Test: 36.76
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.38
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.69
 - F1-Score_Train: 99.69
 - Precision_Test: 17.32
 - Recall_Test: 88.10
 - AUPRC_Test: 79.45
 - Accuracy_Test: 99.27
 - F1-Score_Test: 28.94
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:26:54,135] Trial 5 finished with value: 81.5907783938854 and parameters: {'learning_rate': 0.0667410689205582, 'max_depth': 4, 'n_estimators': 248, 'scale_pos_weight': 5.874528406175232}. Best is trial 4 with value: 82.01437290523499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.56
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.78
 - F1-Score_Train: 99.78
 - Precision_Test: 22.63
 - Recall_Test: 87.30
 - AUPRC_Test: 84.26
 - Accuracy_Test: 99.48
 - F1-Score_Test: 35.95
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 81.5908

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 93.87
 - Recall_Train: 99.98
 - AUPRC_Train: 99.89
 - Accuracy_Train: 96.72
 - F1-Score_Train: 96.83
 - Precision_Test: 2.32
 - Recall_Test: 92.06
 - AUPRC_Test: 75.75
 - Accuracy_Test: 93.48
 - F1-Score_Test: 4.53
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 90.89
 - Recall_Train: 99.92
 - AUPRC_Train: 99.85
 - Accuracy_Train: 94.95
 - F1-Score_Train: 95.19
 - Precision_Test: 1.60
 - Recall_Test: 96.03
 - AUPRC_Test: 71.41
 - Accuracy_Test: 90.04
 - F1-Score_Test: 3.14
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:27:42,546] Trial 6 finished with value: 73.8456849423154 and parameters: {'learning_rate': 0.01294761187895525, 'max_depth': 4, 'n_estimators': 263, 'scale_pos_weight': 10.762658924116758}. Best is trial 4 with value: 82.01437290523499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.04
 - Recall_Train: 99.91
 - AUPRC_Train: 99.85
 - Accuracy_Train: 96.79
 - F1-Score_Train: 96.89
 - Precision_Test: 2.38
 - Recall_Test: 91.27
 - AUPRC_Test: 74.38
 - Accuracy_Test: 93.69
 - F1-Score_Test: 4.64
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 73.8457

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.57
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.78
 - F1-Score_Train: 99.78
 - Precision_Test: 19.71
 - Recall_Test: 85.71
 - AUPRC_Test: 83.03
 - Accuracy_Test: 99.39
 - F1-Score_Test: 32.05
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.36
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.68
 - F1-Score_Train: 99.68
 - Precision_Test: 16.62
 - Recall_Test: 89.68
 - AUPRC_Test: 75.79
 - Accuracy_Test: 99.23
 - F1-Score_Test: 28.04
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:28:22,964] Trial 7 finished with value: 80.86364697835926 and parameters: {'learning_rate': 0.058899489581781504, 'max_depth': 6, 'n_estimators': 145, 'scale_pos_weight': 12.180673714669759}. Best is trial 4 with value: 82.01437290523499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.62
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.81
 - F1-Score_Train: 99.81
 - Precision_Test: 23.01
 - Recall_Test: 87.30
 - AUPRC_Test: 83.77
 - Accuracy_Test: 99.49
 - F1-Score_Test: 36.42
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 80.8636

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 93.58
 - Recall_Train: 99.97
 - AUPRC_Train: 99.87
 - Accuracy_Train: 96.55
 - F1-Score_Train: 96.67
 - Precision_Test: 2.24
 - Recall_Test: 92.06
 - AUPRC_Test: 75.03
 - Accuracy_Test: 93.23
 - F1-Score_Test: 4.38
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 90.60
 - Recall_Train: 99.80
 - AUPRC_Train: 99.80
 - Accuracy_Train: 94.72
 - F1-Score_Train: 94.98
 - Precision_Test: 1.54
 - Recall_Test: 96.03
 - AUPRC_Test: 71.26
 - Accuracy_Test: 89.67
 - F1-Score_Test: 3.03
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:29:07,126] Trial 8 finished with value: 73.76349796255441 and parameters: {'learning_rate': 0.014611808114790776, 'max_depth': 4, 'n_estimators': 188, 'scale_pos_weight': 8.772981912569543}. Best is trial 4 with value: 82.01437290523499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 93.67
 - Recall_Train: 99.73
 - AUPRC_Train: 99.81
 - Accuracy_Train: 96.50
 - F1-Score_Train: 96.61
 - Precision_Test: 2.25
 - Recall_Test: 92.06
 - AUPRC_Test: 75.01
 - Accuracy_Test: 93.25
 - F1-Score_Test: 4.39
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 73.7635

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 92.64
 - Recall_Train: 99.34
 - AUPRC_Train: 99.69
 - Accuracy_Train: 95.72
 - F1-Score_Train: 95.87
 - Precision_Test: 1.92
 - Recall_Test: 92.06
 - AUPRC_Test: 74.74
 - Accuracy_Test: 92.07
 - F1-Score_Test: 3.76
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 90.41
 - Recall_Train: 98.39
 - AUPRC_Train: 99.42
 - Accuracy_Train: 93.98
 - F1-Score_Train: 94.23
 - Precision_Test: 1.52
 - Recall_Test: 95.24
 - AUPRC_Test: 68.79
 - Accuracy_Test: 89.62
 - F1-Score_Test: 3.00
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:29:45,569] Trial 9 finished with value: 72.95595731430997 and parameters: {'learning_rate': 0.010533653572554823, 'max_depth': 3, 'n_estimators': 213, 'scale_pos_weight': 5.0835849540438085}. Best is trial 4 with value: 82.01437290523499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 92.45
 - Recall_Train: 98.63
 - AUPRC_Train: 99.57
 - Accuracy_Train: 95.29
 - F1-Score_Train: 95.44
 - Precision_Test: 1.88
 - Recall_Test: 91.27
 - AUPRC_Test: 75.33
 - Accuracy_Test: 91.95
 - F1-Score_Test: 3.68
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 72.9560

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.86
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.42
 - F1-Score_Train: 99.43
 - Precision_Test: 10.43
 - Recall_Test: 86.51
 - AUPRC_Test: 80.33
 - Accuracy_Test: 98.73
 - F1-Score_Test: 18.62
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.49
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.23
 - F1-Score_Train: 99.24
 - Precision_Test: 8.75
 - Recall_Test: 92.06
 - AUPRC_Test: 73.13
 - Accuracy_Test: 98.37
 - F1-Score_Test: 15.98
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:30:23,705] Trial 10 finished with value: 78.40481136872704 and parameters: {'learning_rate': 0.036192010681496446, 'max_depth': 6, 'n_estimators': 148, 'scale_pos_weight': 7.939258756385188}. Best is trial 4 with value: 82.01437290523499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.45
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.21
 - F1-Score_Train: 99.22
 - Precision_Test: 8.19
 - Recall_Test: 88.89
 - AUPRC_Test: 81.76
 - Accuracy_Test: 98.30
 - F1-Score_Test: 14.99
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 78.4048

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.29
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.64
 - F1-Score_Train: 99.64
 - Precision_Test: 15.10
 - Recall_Test: 86.51
 - AUPRC_Test: 82.92
 - Accuracy_Test: 99.16
 - F1-Score_Test: 25.71
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.91
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.45
 - F1-Score_Train: 99.45
 - Precision_Test: 11.40
 - Recall_Test: 91.27
 - AUPRC_Test: 81.19
 - Accuracy_Test: 98.79
 - F1-Score_Test: 20.26
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:31:16,030] Trial 11 finished with value: 82.58523136831417 and parameters: {'learning_rate': 0.03586062571213978, 'max_depth': 5, 'n_estimators': 245, 'scale_pos_weight': 7.084249372695452}. Best is trial 11 with value: 82.58523136831417.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.15
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.57
 - F1-Score_Train: 99.57
 - Precision_Test: 14.23
 - Recall_Test: 88.10
 - AUPRC_Test: 83.65
 - Accuracy_Test: 99.09
 - F1-Score_Test: 24.50
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 82.5852
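Every fold above reports "Sobreajuste: 1" alongside near-perfect train metrics and much lower test precision/F1. The notebook's actual criterion is not shown in the log; a minimal, hypothetical version of such a flag based on the train/test F1 gap:

```python
# Hypothetical reconstruction of the "Sobreajuste" (overfitting) flag: mark a
# run as overfit when train F1 exceeds test F1 by more than a chosen threshold.
# The 10-point threshold is an assumption, not taken from the notebook.
def sobreajuste(f1_train: float, f1_test: float, threshold: float = 10.0) -> int:
    """Return 1 when the train/test F1 gap (percentage points) exceeds `threshold`."""
    return int(f1_train - f1_test > threshold)

# Trial 11, Fold 3 above: F1-Score_Train 99.57 vs F1-Score_Test 24.50
print(sobreajuste(99.57, 24.50))  # → 1
```

On SMOTE-balanced training data such a gap is expected: train metrics are computed on a 50/50 resampled set, while test metrics face the original ~0.17% fraud rate, so train F1 overstates generalization even for a well-tuned model.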

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.49
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.74
 - F1-Score_Train: 99.74
 - Precision_Test: 18.51
 - Recall_Test: 86.51
 - AUPRC_Test: 82.42
 - Accuracy_Test: 99.34
 - F1-Score_Test: 30.49
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.21
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.60
 - F1-Score_Train: 99.60
 - Precision_Test: 14.85
 - Recall_Test: 89.68
 - AUPRC_Test: 80.11
 - Accuracy_Test: 99.12
 - F1-Score_Test: 25.48
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:32:19,376] Trial 12 finished with value: 82.12714710267731 and parameters: {'learning_rate': 0.03370930150726187, 'max_depth': 5, 'n_estimators': 299, 'scale_pos_weight': 7.479130572725812}. Best is trial 11 with value: 82.58523136831417.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.39
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.70
 - F1-Score_Train: 99.70
 - Precision_Test: 17.92
 - Recall_Test: 87.30
 - AUPRC_Test: 83.85
 - Accuracy_Test: 99.31
 - F1-Score_Test: 29.73
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 82.1271

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.48
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.23
 - F1-Score_Train: 99.23
 - Precision_Test: 8.42
 - Recall_Test: 88.10
 - AUPRC_Test: 82.18
 - Accuracy_Test: 98.37
 - F1-Score_Test: 15.36
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.98
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 98.97
 - F1-Score_Train: 98.98
 - Precision_Test: 6.80
 - Recall_Test: 92.06
 - AUPRC_Test: 75.95
 - Accuracy_Test: 97.86
 - F1-Score_Test: 12.66
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:33:23,926] Trial 13 finished with value: 80.18052914898705 and parameters: {'learning_rate': 0.025441231770727296, 'max_depth': 5, 'n_estimators': 297, 'scale_pos_weight': 14.447566507188089}. Best is trial 11 with value: 82.58523136831417.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.47
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.22
 - F1-Score_Train: 99.23
 - Precision_Test: 8.56
 - Recall_Test: 89.68
 - AUPRC_Test: 82.42
 - Accuracy_Test: 98.37
 - F1-Score_Test: 15.63
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 80.1805

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.57
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.27
 - F1-Score_Train: 99.28
 - Precision_Test: 8.55
 - Recall_Test: 87.30
 - AUPRC_Test: 82.35
 - Accuracy_Test: 98.41
 - F1-Score_Test: 15.57
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.23
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.10
 - F1-Score_Train: 99.11
 - Precision_Test: 7.56
 - Recall_Test: 91.27
 - AUPRC_Test: 76.90
 - Accuracy_Test: 98.11
 - F1-Score_Test: 13.96
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:34:15,627] Trial 14 finished with value: 80.87054555631939 and parameters: {'learning_rate': 0.02768805545279407, 'max_depth': 5, 'n_estimators': 240, 'scale_pos_weight': 7.082951766515296}. Best is trial 11 with value: 82.58523136831417.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.39
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.18
 - F1-Score_Train: 99.19
 - Precision_Test: 8.30
 - Recall_Test: 88.89
 - AUPRC_Test: 83.36
 - Accuracy_Test: 98.33
 - F1-Score_Test: 15.18
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 80.8705

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.51
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.75
 - F1-Score_Train: 99.75
 - Precision_Test: 19.06
 - Recall_Test: 86.51
 - AUPRC_Test: 82.35
 - Accuracy_Test: 99.36
 - F1-Score_Test: 31.23
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.17
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.58
 - F1-Score_Train: 99.58
 - Precision_Test: 13.73
 - Recall_Test: 89.68
 - AUPRC_Test: 76.92
 - Accuracy_Test: 99.03
 - F1-Score_Test: 23.81
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:35:17,088] Trial 15 finished with value: 81.09874115491728 and parameters: {'learning_rate': 0.03530942079438228, 'max_depth': 5, 'n_estimators': 295, 'scale_pos_weight': 9.173936003128407}. Best is trial 11 with value: 82.58523136831417.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.47
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.73
 - F1-Score_Train: 99.73
 - Precision_Test: 19.82
 - Recall_Test: 87.30
 - AUPRC_Test: 84.02
 - Accuracy_Test: 99.38
 - F1-Score_Test: 32.31
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 81.0987

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.94
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 98.95
 - F1-Score_Train: 98.96
 - Precision_Test: 6.19
 - Recall_Test: 87.30
 - AUPRC_Test: 81.91
 - Accuracy_Test: 97.75
 - F1-Score_Test: 11.55
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.41
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 98.67
 - F1-Score_Train: 98.69
 - Precision_Test: 5.53
 - Recall_Test: 93.65
 - AUPRC_Test: 73.88
 - Accuracy_Test: 97.30
 - F1-Score_Test: 10.44
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:36:09,947] Trial 16 finished with value: 78.45864216500667 and parameters: {'learning_rate': 0.02120983226026619, 'max_depth': 5, 'n_estimators': 240, 'scale_pos_weight': 6.661848471397413}. Best is trial 11 with value: 82.58523136831417.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.40
 - Recall_Train: 99.96
 - AUPRC_Train: 99.96
 - Accuracy_Train: 98.65
 - F1-Score_Train: 98.67
 - Precision_Test: 5.33
 - Recall_Test: 90.48
 - AUPRC_Test: 79.59
 - Accuracy_Test: 97.28
 - F1-Score_Test: 10.08
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 78.4586
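
Each result block above prints a binary "Sobreajuste" flag alongside a large train/test gap (e.g. AUPRC_Train 99.99 vs AUPRC_Test 82.35). A hypothetical reconstruction of such a flag is shown below; the 10-point threshold and the function name are illustrative assumptions, not taken from the notebook's code.

```python
def overfit_flag(metric_train: float, metric_test: float, gap: float = 10.0) -> int:
    """Return 1 when the train metric exceeds the test metric by more than
    `gap` percentage points (threshold is an assumption, not the notebook's)."""
    return int(metric_train - metric_test > gap)

# Figures from Fold 1 of the first run logged above:
print(overfit_flag(99.99, 82.35))  # AUPRC_Train vs AUPRC_Test -> 1
```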

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.61
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.80
 - F1-Score_Train: 99.80
 - Precision_Test: 21.17
 - Recall_Test: 86.51
 - AUPRC_Test: 82.19
 - Accuracy_Test: 99.44
 - F1-Score_Test: 34.01
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.38
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.69
 - F1-Score_Train: 99.69
 - Precision_Test: 16.97
 - Recall_Test: 88.10
 - AUPRC_Test: 81.51
 - Accuracy_Test: 99.26
 - F1-Score_Test: 28.46
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:37:11,877] Trial 17 finished with value: 82.43156920418862 and parameters: {'learning_rate': 0.04215820386765781, 'max_depth': 5, 'n_estimators': 267, 'scale_pos_weight': 9.240204207303538}. Best is trial 11 with value: 82.58523136831417.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.59
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.79
 - F1-Score_Train: 99.79
 - Precision_Test: 23.21
 - Recall_Test: 87.30
 - AUPRC_Test: 83.59
 - Accuracy_Test: 99.49
 - F1-Score_Test: 36.67
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 82.4316

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.86
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.93
 - F1-Score_Train: 99.93
 - Precision_Test: 34.64
 - Recall_Test: 84.13
 - AUPRC_Test: 82.78
 - Accuracy_Test: 99.71
 - F1-Score_Test: 49.07
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.76
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.88
 - F1-Score_Train: 99.88
 - Precision_Test: 30.36
 - Recall_Test: 86.51
 - AUPRC_Test: 82.66
 - Accuracy_Test: 99.64
 - F1-Score_Test: 44.95
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:38:28,562] Trial 18 finished with value: 83.52911606459935 and parameters: {'learning_rate': 0.04229951808738263, 'max_depth': 6, 'n_estimators': 264, 'scale_pos_weight': 9.29917634419246}. Best is trial 18 with value: 83.52911606459935.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.84
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.92
 - F1-Score_Train: 99.92
 - Precision_Test: 39.57
 - Recall_Test: 87.30
 - AUPRC_Test: 85.14
 - Accuracy_Test: 99.75
 - F1-Score_Test: 54.46
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 83.5291

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 62.13
 - Recall_Test: 83.33
 - AUPRC_Test: 83.42
 - Accuracy_Test: 99.89
 - F1-Score_Test: 71.19
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 60.23
 - Recall_Test: 84.13
 - AUPRC_Test: 84.19
 - Accuracy_Test: 99.88
 - F1-Score_Test: 70.20
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:39:26,023] Trial 19 finished with value: 84.49389845137142 and parameters: {'learning_rate': 0.08279316655582296, 'max_depth': 6, 'n_estimators': 229, 'scale_pos_weight': 8.602804804198408}. Best is trial 19 with value: 84.49389845137142.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 66.26
 - Recall_Test: 85.71
 - AUPRC_Test: 85.87
 - Accuracy_Test: 99.90
 - F1-Score_Test: 74.74
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.4939

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 66.67
 - Recall_Test: 82.54
 - AUPRC_Test: 83.33
 - Accuracy_Test: 99.90
 - F1-Score_Test: 73.76
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 60.00
 - Recall_Test: 83.33
 - AUPRC_Test: 84.26
 - Accuracy_Test: 99.88
 - F1-Score_Test: 69.77
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:40:23,576] Trial 20 finished with value: 84.3083858957213 and parameters: {'learning_rate': 0.09424993118744852, 'max_depth': 6, 'n_estimators': 219, 'scale_pos_weight': 11.78327088422818}. Best is trial 19 with value: 84.49389845137142.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 66.05
 - Recall_Test: 84.92
 - AUPRC_Test: 85.33
 - Accuracy_Test: 99.90
 - F1-Score_Test: 74.31
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.3084

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 63.25
 - Recall_Test: 83.33
 - AUPRC_Test: 83.30
 - Accuracy_Test: 99.89
 - F1-Score_Test: 71.92
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 59.32
 - Recall_Test: 83.33
 - AUPRC_Test: 84.47
 - Accuracy_Test: 99.88
 - F1-Score_Test: 69.31
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:41:21,092] Trial 21 finished with value: 84.47521530220713 and parameters: {'learning_rate': 0.09408982015895889, 'max_depth': 6, 'n_estimators': 220, 'scale_pos_weight': 12.145834253021942}. Best is trial 19 with value: 84.49389845137142.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 69.03
 - Recall_Test: 84.92
 - AUPRC_Test: 85.66
 - Accuracy_Test: 99.91
 - F1-Score_Test: 76.16
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.4752

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 67.74
 - Recall_Test: 83.33
 - AUPRC_Test: 83.32
 - Accuracy_Test: 99.91
 - F1-Score_Test: 74.73
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 61.18
 - Recall_Test: 82.54
 - AUPRC_Test: 84.81
 - Accuracy_Test: 99.88
 - F1-Score_Test: 70.27
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:42:18,125] Trial 22 finished with value: 84.6491098397098 and parameters: {'learning_rate': 0.0997955498264482, 'max_depth': 6, 'n_estimators': 218, 'scale_pos_weight': 12.026471554732682}. Best is trial 22 with value: 84.6491098397098.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 68.15
 - Recall_Test: 84.92
 - AUPRC_Test: 85.82
 - Accuracy_Test: 99.91
 - F1-Score_Test: 75.62
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.6491

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.97
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 55.32
 - Recall_Test: 82.54
 - AUPRC_Test: 82.72
 - Accuracy_Test: 99.86
 - F1-Score_Test: 66.24
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.94
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.97
 - F1-Score_Train: 99.97
 - Precision_Test: 47.16
 - Recall_Test: 85.71
 - AUPRC_Test: 84.23
 - Accuracy_Test: 99.81
 - F1-Score_Test: 60.85
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:43:14,673] Trial 23 finished with value: 84.11935425988526 and parameters: {'learning_rate': 0.07967057980924105, 'max_depth': 6, 'n_estimators': 202, 'scale_pos_weight': 13.550715490063864}. Best is trial 22 with value: 84.6491098397098.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.96
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.98
 - F1-Score_Train: 99.98
 - Precision_Test: 56.32
 - Recall_Test: 84.92
 - AUPRC_Test: 85.40
 - Accuracy_Test: 99.86
 - F1-Score_Test: 67.72
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.1194

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 60.00
 - Recall_Test: 83.33
 - AUPRC_Test: 83.41
 - Accuracy_Test: 99.88
 - F1-Score_Test: 69.77
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.97
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.98
 - F1-Score_Train: 99.98
 - Precision_Test: 56.91
 - Recall_Test: 84.92
 - AUPRC_Test: 84.71
 - Accuracy_Test: 99.87
 - F1-Score_Test: 68.15
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:44:11,968] Trial 24 finished with value: 84.39839420450207 and parameters: {'learning_rate': 0.08276801762477547, 'max_depth': 6, 'n_estimators': 227, 'scale_pos_weight': 13.320401010709293}. Best is trial 22 with value: 84.6491098397098.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 61.85
 - Recall_Test: 84.92
 - AUPRC_Test: 85.07
 - Accuracy_Test: 99.89
 - F1-Score_Test: 71.57
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.3984

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.96
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.98
 - F1-Score_Train: 99.98
 - Precision_Test: 52.76
 - Recall_Test: 83.33
 - AUPRC_Test: 81.67
 - Accuracy_Test: 99.85
 - F1-Score_Test: 64.62
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.93
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.97
 - F1-Score_Train: 99.97
 - Precision_Test: 47.75
 - Recall_Test: 84.13
 - AUPRC_Test: 84.16
 - Accuracy_Test: 99.82
 - F1-Score_Test: 60.92
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:44:56,911] Trial 25 finished with value: 83.83190614992175 and parameters: {'learning_rate': 0.09757784086285472, 'max_depth': 6, 'n_estimators': 159, 'scale_pos_weight': 11.267136016109033}. Best is trial 22 with value: 84.6491098397098.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.96
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.98
 - F1-Score_Train: 99.98
 - Precision_Test: 53.47
 - Recall_Test: 85.71
 - AUPRC_Test: 85.67
 - Accuracy_Test: 99.85
 - F1-Score_Test: 65.85
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 83.8319

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.90
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.95
 - F1-Score_Train: 99.95
 - Precision_Test: 39.47
 - Recall_Test: 83.33
 - AUPRC_Test: 82.23
 - Accuracy_Test: 99.76
 - F1-Score_Test: 53.57
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.85
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.93
 - F1-Score_Train: 99.93
 - Precision_Test: 37.80
 - Recall_Test: 87.30
 - AUPRC_Test: 83.65
 - Accuracy_Test: 99.74
 - F1-Score_Test: 52.76
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:45:42,896] Trial 26 finished with value: 83.41010865385493 and parameters: {'learning_rate': 0.07325877265600715, 'max_depth': 6, 'n_estimators': 170, 'scale_pos_weight': 13.281644174434579}. Best is trial 22 with value: 84.6491098397098.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.89
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.95
 - F1-Score_Train: 99.95
 - Precision_Test: 44.03
 - Recall_Test: 84.92
 - AUPRC_Test: 84.35
 - Accuracy_Test: 99.79
 - F1-Score_Test: 57.99
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 83.4101

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.83
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.91
 - F1-Score_Train: 99.91
 - Precision_Test: 31.20
 - Recall_Test: 84.92
 - AUPRC_Test: 82.76
 - Accuracy_Test: 99.66
 - F1-Score_Test: 45.63
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.69
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.84
 - F1-Score_Train: 99.84
 - Precision_Test: 26.38
 - Recall_Test: 87.30
 - AUPRC_Test: 82.03
 - Accuracy_Test: 99.57
 - F1-Score_Test: 40.52
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:46:35,176] Trial 27 finished with value: 83.07826899142579 and parameters: {'learning_rate': 0.05386863993627405, 'max_depth': 6, 'n_estimators': 206, 'scale_pos_weight': 14.74357346250575}. Best is trial 22 with value: 84.6491098397098.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.84
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.92
 - F1-Score_Train: 99.92
 - Precision_Test: 37.76
 - Recall_Test: 85.71
 - AUPRC_Test: 84.44
 - Accuracy_Test: 99.74
 - F1-Score_Test: 52.43
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 83.0783

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 67.74
 - Recall_Test: 83.33
 - AUPRC_Test: 83.65
 - Accuracy_Test: 99.91
 - F1-Score_Test: 74.73
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 64.02
 - Recall_Test: 83.33
 - AUPRC_Test: 85.01
 - Accuracy_Test: 99.89
 - F1-Score_Test: 72.41
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:47:37,177] Trial 28 finished with value: 84.6309542107084 and parameters: {'learning_rate': 0.09987853237607923, 'max_depth': 6, 'n_estimators': 229, 'scale_pos_weight': 12.674759843282501}. Best is trial 22 with value: 84.6491098397098.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 65.64
 - Recall_Test: 84.92
 - AUPRC_Test: 85.24
 - Accuracy_Test: 99.90
 - F1-Score_Test: 74.05
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.6310

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.50
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.24
 - F1-Score_Train: 99.24
 - Precision_Test: 8.38
 - Recall_Test: 87.30
 - AUPRC_Test: 82.58
 - Accuracy_Test: 98.37
 - F1-Score_Test: 15.29
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.08
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.02
 - F1-Score_Train: 99.03
 - Precision_Test: 7.14
 - Recall_Test: 92.86
 - AUPRC_Test: 76.74
 - Accuracy_Test: 97.96
 - F1-Score_Test: 13.27
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:48:05,109] Trial 29 finished with value: 80.71202163792437 and parameters: {'learning_rate': 0.06561064528456287, 'max_depth': 5, 'n_estimators': 107, 'scale_pos_weight': 9.950903892133745}. Best is trial 22 with value: 84.6491098397098.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.40
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.19
 - F1-Score_Train: 99.19
 - Precision_Test: 8.15
 - Recall_Test: 88.89
 - AUPRC_Test: 82.82
 - Accuracy_Test: 98.30
 - F1-Score_Test: 14.92
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 80.7120

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.75
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.88
 - F1-Score_Train: 99.88
 - Precision_Test: 27.27
 - Recall_Test: 85.71
 - AUPRC_Test: 82.75
 - Accuracy_Test: 99.59
 - F1-Score_Test: 41.38
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.59
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.79
 - F1-Score_Train: 99.79
 - Precision_Test: 22.52
 - Recall_Test: 86.51
 - AUPRC_Test: 78.75
 - Accuracy_Test: 99.48
 - F1-Score_Test: 35.74
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:48:40,583] Trial 30 finished with value: 81.98502239664322 and parameters: {'learning_rate': 0.07913928891684985, 'max_depth': 6, 'n_estimators': 123, 'scale_pos_weight': 11.253706694179165}. Best is trial 22 with value: 84.6491098397098.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.76
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.88
 - F1-Score_Train: 99.88
 - Precision_Test: 29.86
 - Recall_Test: 86.51
 - AUPRC_Test: 84.45
 - Accuracy_Test: 99.64
 - F1-Score_Test: 44.40
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 81.9850

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 64.81
 - Recall_Test: 83.33
 - AUPRC_Test: 83.81
 - Accuracy_Test: 99.90
 - F1-Score_Test: 72.92
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 62.50
 - Recall_Test: 83.33
 - AUPRC_Test: 84.74
 - Accuracy_Test: 99.89
 - F1-Score_Test: 71.43
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:49:38,224] Trial 31 finished with value: 84.72200668322499 and parameters: {'learning_rate': 0.096389491823576, 'max_depth': 6, 'n_estimators': 229, 'scale_pos_weight': 12.927141498539687}. Best is trial 31 with value: 84.72200668322499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 69.03
 - Recall_Test: 84.92
 - AUPRC_Test: 85.62
 - Accuracy_Test: 99.91
 - F1-Score_Test: 76.16
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.7220
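Every fold result above carries "Sobreajuste: 1", flagging that train metrics saturate near 100% while test precision drops sharply. The exact rule the notebook uses is not visible in this output, so the helper below is a hypothetical reconstruction: flag overfitting when the train/test AUPRC gap exceeds a threshold.

```python
def overfit_flag(train_auprc: float, test_auprc: float, gap_threshold: float = 5.0) -> int:
    """Return 1 when the train/test AUPRC gap exceeds the threshold.

    Hypothetical rule: the notebook's actual criterion is not shown in the logs.
    """
    return int(train_auprc - test_auprc > gap_threshold)

# Values from Fold 1 of the first trial above: AUPRC_Train 100.00 vs AUPRC_Test 83.65
print(overfit_flag(100.00, 83.65))  # gap of ~16 points -> flagged
print(overfit_flag(85.00, 84.00))   # small gap -> not flagged
```

Under this reading, a gap of 15–20 AUPRC points (typical in the folds above) always trips the flag, which matches the constant "Sobreajuste: 1" in the output.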

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 61.76
 - Recall_Test: 83.33
 - AUPRC_Test: 83.58
 - Accuracy_Test: 99.89
 - F1-Score_Test: 70.95
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.97
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 58.56
 - Recall_Test: 84.13
 - AUPRC_Test: 84.42
 - Accuracy_Test: 99.87
 - F1-Score_Test: 69.06
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:50:36,046] Trial 32 finished with value: 84.41202250085115 and parameters: {'learning_rate': 0.08330260705016251, 'max_depth': 6, 'n_estimators': 230, 'scale_pos_weight': 12.830432435686129}. Best is trial 31 with value: 84.72200668322499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 62.94
 - Recall_Test: 84.92
 - AUPRC_Test: 85.24
 - Accuracy_Test: 99.89
 - F1-Score_Test: 72.30
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.4120

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.92
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.96
 - F1-Score_Train: 99.96
 - Precision_Test: 41.73
 - Recall_Test: 84.13
 - AUPRC_Test: 83.02
 - Accuracy_Test: 99.78
 - F1-Score_Test: 55.79
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.85
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.92
 - F1-Score_Train: 99.92
 - Precision_Test: 36.49
 - Recall_Test: 85.71
 - AUPRC_Test: 83.40
 - Accuracy_Test: 99.72
 - F1-Score_Test: 51.18
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:51:26,327] Trial 33 finished with value: 83.83608566796094 and parameters: {'learning_rate': 0.06806205549222889, 'max_depth': 6, 'n_estimators': 195, 'scale_pos_weight': 13.761935596255714}. Best is trial 31 with value: 84.72200668322499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.91
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.95
 - F1-Score_Train: 99.95
 - Precision_Test: 45.38
 - Recall_Test: 85.71
 - AUPRC_Test: 85.09
 - Accuracy_Test: 99.80
 - F1-Score_Test: 59.34
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 83.8361

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 69.54
 - Recall_Test: 83.33
 - AUPRC_Test: 83.53
 - Accuracy_Test: 99.91
 - F1-Score_Test: 75.81
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 64.42
 - Recall_Test: 83.33
 - AUPRC_Test: 85.11
 - Accuracy_Test: 99.89
 - F1-Score_Test: 72.66
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:52:34,256] Trial 34 finished with value: 84.5850001188499 and parameters: {'learning_rate': 0.09757508786165124, 'max_depth': 6, 'n_estimators': 255, 'scale_pos_weight': 14.256821641418599}. Best is trial 31 with value: 84.72200668322499.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 71.33
 - Recall_Test: 84.92
 - AUPRC_Test: 85.12
 - Accuracy_Test: 99.92
 - F1-Score_Test: 77.54
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.5850

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 67.97
 - Recall_Test: 82.54
 - AUPRC_Test: 83.65
 - Accuracy_Test: 99.91
 - F1-Score_Test: 74.55
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 64.60
 - Recall_Test: 82.54
 - AUPRC_Test: 85.21
 - Accuracy_Test: 99.89
 - F1-Score_Test: 72.47
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:53:40,084] Trial 35 finished with value: 84.80254817584363 and parameters: {'learning_rate': 0.09769777927597675, 'max_depth': 6, 'n_estimators': 255, 'scale_pos_weight': 14.218608915998557}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 71.33
 - Recall_Test: 84.92
 - AUPRC_Test: 85.54
 - Accuracy_Test: 99.92
 - F1-Score_Test: 77.54
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.8025
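The value each trial reports to Optuna (e.g. trial 35's 84.80 above) is the mean test AUPRC over the three folds. A minimal sketch of that objective on toy data, assuming AUPRC is computed with scikit-learn's `average_precision_score` and scaled to a percentage as in the logs (the labels and scores below are synthetic):

```python
import numpy as np
from sklearn.metrics import average_precision_score

def fold_auprc(seed: int) -> float:
    """Toy fold: imbalanced labels with scores that mostly rank positives first."""
    rng = np.random.default_rng(seed)
    y_true = np.array([0] * 95 + [1] * 5)
    y_score = np.concatenate([rng.uniform(0.0, 0.5, 95),   # negatives: low scores
                              rng.uniform(0.3, 1.0, 5)])   # positives: higher scores
    return 100 * average_precision_score(y_true, y_score)  # percentage, like the logs

# The per-trial objective Optuna maximizes: mean test AUPRC across the folds
scores = [fold_auprc(s) for s in (1, 2, 3)]
print(f"🏆 Promedio de AUPRC en validación cruzada: {np.mean(scores):.4f}")
```

Maximizing the cross-fold mean rather than a single fold's score is what makes trials like 36 (77.95, shallow trees) lose to trial 35 (84.80) even when individual folds occasionally score well.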

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.35
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.16
 - F1-Score_Train: 99.17
 - Precision_Test: 7.93
 - Recall_Test: 88.89
 - AUPRC_Test: 80.26
 - Accuracy_Test: 98.25
 - F1-Score_Test: 14.56
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.57
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 98.75
 - F1-Score_Train: 98.77
 - Precision_Test: 5.95
 - Recall_Test: 92.86
 - AUPRC_Test: 75.82
 - Accuracy_Test: 97.52
 - F1-Score_Test: 11.19
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:54:17,981] Trial 36 finished with value: 77.94933183673454 and parameters: {'learning_rate': 0.06049671902357879, 'max_depth': 3, 'n_estimators': 278, 'scale_pos_weight': 12.929367685034475}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.15
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.06
 - F1-Score_Train: 99.07
 - Precision_Test: 7.17
 - Recall_Test: 87.30
 - AUPRC_Test: 77.78
 - Accuracy_Test: 98.08
 - F1-Score_Test: 13.25
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 77.9493

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 63.64
 - Recall_Test: 83.33
 - AUPRC_Test: 84.06
 - Accuracy_Test: 99.89
 - F1-Score_Test: 72.16
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.97
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 57.30
 - Recall_Test: 84.13
 - AUPRC_Test: 84.61
 - Accuracy_Test: 99.87
 - F1-Score_Test: 68.17
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:55:18,757] Trial 37 finished with value: 84.57824750440376 and parameters: {'learning_rate': 0.09930709483939626, 'max_depth': 5, 'n_estimators': 281, 'scale_pos_weight': 14.998740583026917}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 62.21
 - Recall_Test: 84.92
 - AUPRC_Test: 85.07
 - Accuracy_Test: 99.89
 - F1-Score_Test: 71.81
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.5782

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.91
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.96
 - F1-Score_Train: 99.96
 - Precision_Test: 41.96
 - Recall_Test: 84.92
 - AUPRC_Test: 82.65
 - Accuracy_Test: 99.78
 - F1-Score_Test: 56.17
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.85
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.92
 - F1-Score_Train: 99.92
 - Precision_Test: 36.73
 - Recall_Test: 85.71
 - AUPRC_Test: 83.70
 - Accuracy_Test: 99.73
 - F1-Score_Test: 51.43
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:56:26,345] Trial 38 finished with value: 83.91690101454749 and parameters: {'learning_rate': 0.0517794151280147, 'max_depth': 6, 'n_estimators': 251, 'scale_pos_weight': 12.526156465651162}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.90
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.95
 - F1-Score_Train: 99.95
 - Precision_Test: 46.78
 - Recall_Test: 86.51
 - AUPRC_Test: 85.41
 - Accuracy_Test: 99.81
 - F1-Score_Test: 60.72
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 83.9169

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 56.38
 - Recall_Test: 84.13
 - AUPRC_Test: 83.47
 - Accuracy_Test: 99.86
 - F1-Score_Test: 67.52
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.94
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.97
 - F1-Score_Train: 99.97
 - Precision_Test: 49.52
 - Recall_Test: 82.54
 - AUPRC_Test: 84.27
 - Accuracy_Test: 99.83
 - F1-Score_Test: 61.90
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:57:26,387] Trial 39 finished with value: 84.2293339619397 and parameters: {'learning_rate': 0.07406440186743024, 'max_depth': 6, 'n_estimators': 235, 'scale_pos_weight': 14.10472687758145}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 59.12
 - Recall_Test: 84.92
 - AUPRC_Test: 84.94
 - Accuracy_Test: 99.88
 - F1-Score_Test: 69.71
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.2293

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 57.69
 - Recall_Test: 83.33
 - AUPRC_Test: 82.84
 - Accuracy_Test: 99.87
 - F1-Score_Test: 68.18
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.96
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.98
 - F1-Score_Train: 99.98
 - Precision_Test: 53.85
 - Recall_Test: 83.33
 - AUPRC_Test: 84.01
 - Accuracy_Test: 99.85
 - F1-Score_Test: 65.42
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:58:19,937] Trial 40 finished with value: 83.87110040393095 and parameters: {'learning_rate': 0.08788926803943453, 'max_depth': 6, 'n_estimators': 207, 'scale_pos_weight': 11.754962062829808}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 62.57
 - Recall_Test: 84.92
 - AUPRC_Test: 84.77
 - Accuracy_Test: 99.89
 - F1-Score_Test: 72.05
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 83.8711

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 100.00
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 65.82
 - Recall_Test: 82.54
 - AUPRC_Test: 83.28
 - Accuracy_Test: 99.90
 - F1-Score_Test: 73.24
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 62.72
 - Recall_Test: 84.13
 - AUPRC_Test: 85.11
 - Accuracy_Test: 99.89
 - F1-Score_Test: 71.86
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 13:59:23,740] Trial 41 finished with value: 84.61044405043434 and parameters: {'learning_rate': 0.08766984736408615, 'max_depth': 6, 'n_estimators': 253, 'scale_pos_weight': 14.129554192938574}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 100.00
 - F1-Score_Train: 100.00
 - Precision_Test: 65.64
 - Recall_Test: 84.92
 - AUPRC_Test: 85.44
 - Accuracy_Test: 99.90
 - F1-Score_Test: 74.05
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.6104

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.99
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 61.05
 - Recall_Test: 83.33
 - AUPRC_Test: 83.13
 - Accuracy_Test: 99.88
 - F1-Score_Test: 70.47
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.97
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.98
 - F1-Score_Train: 99.98
 - Precision_Test: 55.50
 - Recall_Test: 84.13
 - AUPRC_Test: 85.24
 - Accuracy_Test: 99.86
 - F1-Score_Test: 66.88
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 14:00:31,532] Trial 42 finished with value: 84.70247435489021 and parameters: {'learning_rate': 0.07268320017950446, 'max_depth': 6, 'n_estimators': 259, 'scale_pos_weight': 13.808745386482586}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 64.85
 - Recall_Test: 84.92
 - AUPRC_Test: 85.73
 - Accuracy_Test: 99.90
 - F1-Score_Test: 73.54
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.7025

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.97
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 57.07
 - Recall_Test: 83.33
 - AUPRC_Test: 83.67
 - Accuracy_Test: 99.87
 - F1-Score_Test: 67.74
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.95
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.97
 - F1-Score_Train: 99.97
 - Precision_Test: 49.77
 - Recall_Test: 85.71
 - AUPRC_Test: 84.74
 - Accuracy_Test: 99.83
 - F1-Score_Test: 62.97
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 14:01:44,808] Trial 43 finished with value: 84.3940267654455 and parameters: {'learning_rate': 0.06115769628397858, 'max_depth': 6, 'n_estimators': 274, 'scale_pos_weight': 12.97679547241286}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.97
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 56.54
 - Recall_Test: 85.71
 - AUPRC_Test: 84.78
 - Accuracy_Test: 99.87
 - F1-Score_Test: 68.14
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.3940

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.95
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.98
 - F1-Score_Train: 99.98
 - Precision_Test: 51.46
 - Recall_Test: 84.13
 - AUPRC_Test: 83.08
 - Accuracy_Test: 99.84
 - F1-Score_Test: 63.86
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.94
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.97
 - F1-Score_Train: 99.97
 - Precision_Test: 46.72
 - Recall_Test: 84.92
 - AUPRC_Test: 84.33
 - Accuracy_Test: 99.81
 - F1-Score_Test: 60.28
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 14:02:41,489] Trial 44 finished with value: 84.24202608093834 and parameters: {'learning_rate': 0.07382979755866753, 'max_depth': 6, 'n_estimators': 214, 'scale_pos_weight': 12.486802872834668}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.96
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.98
 - F1-Score_Train: 99.98
 - Precision_Test: 56.25
 - Recall_Test: 85.71
 - AUPRC_Test: 85.32
 - Accuracy_Test: 99.86
 - F1-Score_Test: 67.92
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.2420

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.61
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.80
 - F1-Score_Train: 99.80
 - Precision_Test: 21.78
 - Recall_Test: 87.30
 - AUPRC_Test: 81.10
 - Accuracy_Test: 99.45
 - F1-Score_Test: 34.87
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.24
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.62
 - F1-Score_Train: 99.62
 - Precision_Test: 14.91
 - Recall_Test: 89.68
 - AUPRC_Test: 80.62
 - Accuracy_Test: 99.12
 - F1-Score_Test: 25.57
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 14:03:28,834] Trial 45 finished with value: 81.84253600082046 and parameters: {'learning_rate': 0.06800639120007515, 'max_depth': 4, 'n_estimators': 255, 'scale_pos_weight': 13.72211105327384}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.43
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.71
 - F1-Score_Train: 99.71
 - Precision_Test: 18.21
 - Recall_Test: 87.30
 - AUPRC_Test: 83.81
 - Accuracy_Test: 99.32
 - F1-Score_Test: 30.14
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 81.8425

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.91
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.95
 - F1-Score_Train: 99.95
 - Precision_Test: 43.09
 - Recall_Test: 84.13
 - AUPRC_Test: 82.89
 - Accuracy_Test: 99.79
 - F1-Score_Test: 56.99
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.83
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.92
 - F1-Score_Train: 99.92
 - Precision_Test: 34.84
 - Recall_Test: 85.71
 - AUPRC_Test: 83.79
 - Accuracy_Test: 99.71
 - F1-Score_Test: 49.54
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 14:04:13,718] Trial 46 finished with value: 83.74850152967751 and parameters: {'learning_rate': 0.09032390378027735, 'max_depth': 5, 'n_estimators': 195, 'scale_pos_weight': 12.06945089918033}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.87
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.93
 - F1-Score_Train: 99.93
 - Precision_Test: 40.38
 - Recall_Test: 84.92
 - AUPRC_Test: 84.57
 - Accuracy_Test: 99.76
 - F1-Score_Test: 54.73
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 83.7485

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 57.38
 - Recall_Test: 83.33
 - AUPRC_Test: 82.15
 - Accuracy_Test: 99.87
 - F1-Score_Test: 67.96
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.96
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.98
 - F1-Score_Train: 99.98
 - Precision_Test: 54.08
 - Recall_Test: 84.13
 - AUPRC_Test: 84.51
 - Accuracy_Test: 99.85
 - F1-Score_Test: 65.84
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 14:05:11,505] Trial 47 finished with value: 84.02640705477707 and parameters: {'learning_rate': 0.07820734925275324, 'max_depth': 6, 'n_estimators': 224, 'scale_pos_weight': 11.07498033580423}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 62.79
 - Recall_Test: 85.71
 - AUPRC_Test: 85.41
 - Accuracy_Test: 99.89
 - F1-Score_Test: 72.48
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.0264

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.98
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 59.09
 - Recall_Test: 82.54
 - AUPRC_Test: 83.31
 - Accuracy_Test: 99.87
 - F1-Score_Test: 68.87
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.96
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.98
 - F1-Score_Train: 99.98
 - Precision_Test: 53.89
 - Recall_Test: 82.54
 - AUPRC_Test: 83.88
 - Accuracy_Test: 99.85
 - F1-Score_Test: 65.20
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 14:06:11,631] Trial 48 finished with value: 84.14956191958008 and parameters: {'learning_rate': 0.0999907772460011, 'max_depth': 5, 'n_estimators': 243, 'scale_pos_weight': 10.3224165372168}. Best is trial 35 with value: 84.80254817584363.
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.97
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.99
 - F1-Score_Train: 99.99
 - Precision_Test: 59.78
 - Recall_Test: 84.92
 - AUPRC_Test: 85.26
 - Accuracy_Test: 99.88
 - F1-Score_Test: 70.16
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 84.1496

🔍 Optimizando hiperparámetros para XGBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.24
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 98.58
 - F1-Score_Train: 98.60
 - Precision_Test: 4.60
 - Recall_Test: 87.30
 - AUPRC_Test: 74.48
 - Accuracy_Test: 96.93
 - F1-Score_Test: 8.74
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (1, 22)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...

✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 95.75
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 97.78
 - F1-Score_Train: 97.83
 - Precision_Test: 3.36
 - Recall_Test: 93.65
 - AUPRC_Test: 70.71
 - Accuracy_Test: 95.46
 - F1-Score_Test: 6.49
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (2, 22)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando XGBoost (Optuna con SMOTE)...
[I 2024-12-19 14:06:56,848] Trial 49 finished with value: 73.32809020739643 and parameters: {'learning_rate': 0.016815478978109078, 'max_depth': 6, 'n_estimators': 177, 'scale_pos_weight': 14.611683058636316}. Best is trial 35 with value: 84.80254817584363.
[I 2024-12-19 14:06:56,852] A new study created in memory with name: no-name-7560ed73-8ab9-4145-a9fd-63e8e7794e9c
✅ Resultados para XGBoost (Optuna con SMOTE):
 - Modelo: XGBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 96.47
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 98.17
 - F1-Score_Train: 98.20
 - Precision_Test: 3.92
 - Recall_Test: 92.06
 - AUPRC_Test: 74.80
 - Accuracy_Test: 96.19
 - F1-Score_Test: 7.51
 - objective: None
 - kwargs: None
✅ Tamaño del DataFrame actualizado: (3, 22)

🏆 Promedio de AUPRC en validación cruzada: 73.3281
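The "Promedio de AUPRC en validación cruzada" figures reported above are per-fold average-precision scores on the held-out split, then averaged. A minimal scikit-learn-only sketch of that computation follows; `LogisticRegression` on synthetic imbalanced data stands in for the notebook's XGBoost model and real dataset.

```python
# Hedged sketch: mean cross-validated AUPRC (average precision) over
# stratified folds, analogous to the averages reported in the log above.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

X, y = make_classification(n_samples=3000, weights=[0.98], random_state=0)

scores = []
for tr, te in StratifiedKFold(n_splits=3, shuffle=True, random_state=0).split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[tr], y[tr])
    proba = model.predict_proba(X[te])[:, 1]
    scores.append(average_precision_score(y[te], proba))

print(f"Promedio de AUPRC: {100 * np.mean(scores):.4f}")
```

AUPRC (area under the precision-recall curve) is the appropriate headline metric here because, with ~0.17% positives, accuracy is dominated by the majority class.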

🚀 Optimización de Hiperparámetros con Optuna para CatBoost...

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4850805	total: 163ms	remaining: 22.7s
[... CatBoost training log, iterations 1-138 omitted (learn loss 0.3488 → 0.0082) ...]
139:	learn: 0.0081490	total: 11.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.14
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 99.56
 - F1-Score_Train: 99.57
 - Precision_Test: 12.91
 - Recall_Test: 87.30
 - AUPRC_Test: 72.07
 - Accuracy_Test: 98.99
 - F1-Score_Test: 22.49
 - max_depth: 4
 - n_estimators: 140
 - learning_rate: 0.10
 - scale_pos_weight: 5.49
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4987237	total: 134ms	remaining: 18.6s
[... CatBoost training log, iterations 1-71 omitted (learn loss 0.3753 → 0.0281) ...]
72:	learn: 0.0276527	total: 7.03s	remaining: 6.45s
73:	learn: 0.0271716	total: 7.11s	remaining: 6.34s
74:	learn: 0.0269223	total: 7.18s	remaining: 6.22s
75:	learn: 0.0266848	total: 7.24s	remaining: 6.1s
76:	learn: 0.0264010	total: 7.31s	remaining: 5.98s
77:	learn: 0.0260606	total: 7.38s	remaining: 5.87s
78:	learn: 0.0256538	total: 7.46s	remaining: 5.76s
79:	learn: 0.0252792	total: 7.53s	remaining: 5.65s
80:	learn: 0.0250450	total: 7.6s	remaining: 5.54s
81:	learn: 0.0247367	total: 7.67s	remaining: 5.43s
82:	learn: 0.0244568	total: 7.74s	remaining: 5.31s
83:	learn: 0.0241241	total: 7.81s	remaining: 5.21s
84:	learn: 0.0237294	total: 7.88s	remaining: 5.1s
85:	learn: 0.0234208	total: 7.95s	remaining: 4.99s
86:	learn: 0.0230311	total: 8.03s	remaining: 4.89s
87:	learn: 0.0227377	total: 8.09s	remaining: 4.78s
88:	learn: 0.0224519	total: 8.18s	remaining: 4.69s
89:	learn: 0.0220079	total: 8.26s	remaining: 4.59s
90:	learn: 0.0218266	total: 8.32s	remaining: 4.48s
91:	learn: 0.0214140	total: 8.41s	remaining: 4.39s
92:	learn: 0.0211610	total: 8.48s	remaining: 4.28s
93:	learn: 0.0208359	total: 8.55s	remaining: 4.18s
94:	learn: 0.0204941	total: 8.61s	remaining: 4.08s
95:	learn: 0.0202830	total: 8.68s	remaining: 3.98s
96:	learn: 0.0200876	total: 8.76s	remaining: 3.88s
97:	learn: 0.0198860	total: 8.83s	remaining: 3.78s
98:	learn: 0.0195646	total: 8.9s	remaining: 3.69s
99:	learn: 0.0193419	total: 8.98s	remaining: 3.59s
100:	learn: 0.0191320	total: 9.05s	remaining: 3.5s
101:	learn: 0.0186840	total: 9.13s	remaining: 3.4s
102:	learn: 0.0185088	total: 9.21s	remaining: 3.31s
103:	learn: 0.0181775	total: 9.29s	remaining: 3.21s
104:	learn: 0.0180884	total: 9.36s	remaining: 3.12s
105:	learn: 0.0178027	total: 9.43s	remaining: 3.02s
106:	learn: 0.0175887	total: 9.5s	remaining: 2.93s
107:	learn: 0.0174049	total: 9.57s	remaining: 2.84s
108:	learn: 0.0170796	total: 9.65s	remaining: 2.74s
109:	learn: 0.0169741	total: 9.71s	remaining: 2.65s
110:	learn: 0.0167783	total: 9.79s	remaining: 2.56s
111:	learn: 0.0165614	total: 9.87s	remaining: 2.47s
112:	learn: 0.0164099	total: 9.93s	remaining: 2.37s
113:	learn: 0.0163217	total: 10s	remaining: 2.28s
114:	learn: 0.0162151	total: 10.1s	remaining: 2.19s
115:	learn: 0.0159884	total: 10.2s	remaining: 2.1s
116:	learn: 0.0157322	total: 10.2s	remaining: 2.01s
117:	learn: 0.0155539	total: 10.3s	remaining: 1.92s
118:	learn: 0.0153909	total: 10.4s	remaining: 1.83s
119:	learn: 0.0151638	total: 10.4s	remaining: 1.74s
120:	learn: 0.0151042	total: 10.5s	remaining: 1.65s
121:	learn: 0.0148836	total: 10.6s	remaining: 1.56s
122:	learn: 0.0147784	total: 10.7s	remaining: 1.47s
123:	learn: 0.0146955	total: 10.7s	remaining: 1.38s
124:	learn: 0.0145289	total: 10.8s	remaining: 1.29s
125:	learn: 0.0143643	total: 10.9s	remaining: 1.21s
126:	learn: 0.0142227	total: 10.9s	remaining: 1.12s
127:	learn: 0.0140333	total: 11s	remaining: 1.03s
128:	learn: 0.0139484	total: 11.1s	remaining: 944ms
129:	learn: 0.0137801	total: 11.1s	remaining: 856ms
130:	learn: 0.0135596	total: 11.2s	remaining: 771ms
131:	learn: 0.0134060	total: 11.3s	remaining: 685ms
132:	learn: 0.0132186	total: 11.4s	remaining: 598ms
133:	learn: 0.0130907	total: 11.4s	remaining: 512ms
134:	learn: 0.0129283	total: 11.5s	remaining: 426ms
135:	learn: 0.0128073	total: 11.6s	remaining: 341ms
136:	learn: 0.0127178	total: 11.7s	remaining: 255ms
137:	learn: 0.0125939	total: 11.7s	remaining: 170ms
138:	learn: 0.0124044	total: 11.8s	remaining: 84.8ms
139:	learn: 0.0123299	total: 11.9s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.75
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.37
 - F1-Score_Train: 99.37
 - Precision_Test: 10.31
 - Recall_Test: 89.68
 - AUPRC_Test: 71.41
 - Accuracy_Test: 98.67
 - F1-Score_Test: 18.49
 - max_depth: 4
 - n_estimators: 140
 - learning_rate: 0.10
 - scale_pos_weight: 5.49
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4990753	total: 145ms	remaining: 20.1s
... (iterations 1-138 omitted; training loss decreased steadily) ...
139:	learn: 0.0107939	total: 12.6s	remaining: 0us
[I 2024-12-19 14:07:40,213] Trial 0 finished with value: 71.43498362547278 and parameters: {'learning_rate': 0.09972096837965244, 'max_depth': 4, 'n_estimators': 140, 'scale_pos_weight': 5.488005039321137}. Best is trial 0 with value: 71.43498362547278.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.80
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 99.39
 - F1-Score_Train: 99.40
 - Precision_Test: 10.49
 - Recall_Test: 88.10
 - AUPRC_Test: 70.83
 - Accuracy_Test: 98.72
 - F1-Score_Test: 18.75
 - max_depth: 4
 - n_estimators: 140
 - learning_rate: 0.10
 - scale_pos_weight: 5.49
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 71.4350
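The fold-by-fold output above follows a standard pattern: rebalance only the training split of each fold, fit the model, score AUPRC on the untouched validation split, and average across folds. The sketch below illustrates that loop with scikit-learn only; `LogisticRegression` and a naive random-oversampler stand in for the notebook's CatBoost and SMOTE, and names like `run_cv_auprc` are illustrative, not taken from the notebook.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold


def oversample(X, y, rng):
    # Naive random oversampling of the minority class up to 50/50.
    # (Stand-in for SMOTE, which interpolates synthetic samples instead.)
    minority = np.flatnonzero(y == 1)
    majority = np.flatnonzero(y == 0)
    extra = rng.choice(minority, size=len(majority) - len(minority), replace=True)
    idx = np.concatenate([majority, minority, extra])
    return X[idx], y[idx]


def run_cv_auprc(X, y, n_splits=3, seed=42):
    rng = np.random.default_rng(seed)
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    scores = []
    for train_idx, val_idx in skf.split(X, y):
        # Rebalance ONLY the training fold; the validation fold keeps the
        # original imbalance, so the AUPRC estimate stays honest.
        X_tr, y_tr = oversample(X[train_idx], y[train_idx], rng)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        proba = model.predict_proba(X[val_idx])[:, 1]
        scores.append(average_precision_score(y[val_idx], proba))
    return float(np.mean(scores))


# Synthetic imbalanced data (~2% positives) as a small demo.
X, y = make_classification(n_samples=5000, weights=[0.98], random_state=0)
print(round(run_cv_auprc(X, y), 4))
```

Resampling inside the fold loop (rather than before the split) is what prevents synthetic minority samples from leaking into the validation data.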

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6409413	total: 74.2ms	remaining: 16.5s
... (iterations 1-222 omitted; training loss decreased steadily) ...
223:	learn: 0.0214158	total: 21.5s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 95.24
 - Recall_Train: 100.00
 - AUPRC_Train: 99.87
 - Accuracy_Train: 97.50
 - F1-Score_Train: 97.56
 - Precision_Test: 2.92
 - Recall_Test: 91.27
 - AUPRC_Test: 64.96
 - Accuracy_Test: 94.87
 - F1-Score_Test: 5.65
 - Hiperparámetros ajustados (todos los demás parámetros de CatBoost quedaron sin fijar, con valor None):
 - max_depth: 4
 - n_estimators: 224
 - learning_rate: 0.02
 - scale_pos_weight: 10.17
✅ Tamaño del DataFrame actualizado: (1, 133)
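El flujo que producen estos mensajes — balancear con SMOTE solo el fold de entrenamiento, entrenar un modelo de boosting con los hiperparámetros encontrados y medir la AUPRC sobre el fold de test sin balancear — puede esbozarse así. Es un boceto mínimo con datos sintéticos (`make_classification`), que usa `GradientBoostingClassifier` de scikit-learn como sustituto ligero de CatBoost y un sobremuestreo aleatorio simple en lugar de SMOTE; no reproduce los resultados del notebook.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# Datos sintéticos fuertemente desbalanceados (sustituyen al dataset real de fraudes)
X, y = make_classification(n_samples=3000, n_features=10, weights=[0.98, 0.02],
                           random_state=42)

skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=42)
rng = np.random.default_rng(42)
auprc_folds = []

for tr_idx, te_idx in skf.split(X, y):
    X_tr, y_tr = X[tr_idx], y[tr_idx]
    # Sobremuestreo SOLO en el fold de entrenamiento: el fold de test
    # conserva la proporción original, como muestra el log (0.9983 / 0.0017)
    pos, neg = np.where(y_tr == 1)[0], np.where(y_tr == 0)[0]
    extra = rng.choice(pos, size=len(neg) - len(pos), replace=True)
    idx_bal = np.concatenate([neg, pos, extra])

    # Hiperparámetros análogos a los del log (n_estimators reducido por rapidez)
    clf = GradientBoostingClassifier(max_depth=4, n_estimators=100,
                                     learning_rate=0.02, random_state=42)
    clf.fit(X_tr[idx_bal], y_tr[idx_bal])
    proba = clf.predict_proba(X[te_idx])[:, 1]
    auprc_folds.append(average_precision_score(y[te_idx], proba))

print(f"AUPRC promedio en validación cruzada: {100 * np.mean(auprc_folds):.2f}")
</imports>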

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6455346	total: 73.1ms	remaining: 16.3s
[... iteraciones 1-222 omitidas ...]
223:	learn: 0.0338858	total: 19.3s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 93.23
 - Recall_Train: 100.00
 - AUPRC_Train: 99.81
 - Accuracy_Train: 96.37
 - F1-Score_Train: 96.50
 - Precision_Test: 2.21
 - Recall_Test: 96.83
 - AUPRC_Test: 63.73
 - Accuracy_Test: 92.79
 - F1-Score_Test: 4.32
 - Hiperparámetros ajustados (todos los demás parámetros de CatBoost quedaron sin fijar, con valor None):
 - max_depth: 4
 - n_estimators: 224
 - learning_rate: 0.02
 - scale_pos_weight: 10.17
✅ Tamaño del DataFrame actualizado: (2, 133)
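El campo «Sobreajuste: 1» refleja la brecha entre las métricas de entrenamiento (AUPRC ≈ 99.8) y de test (AUPRC ≈ 64). El criterio exacto que usa el notebook no aparece en esta sección; una regla hipotética equivalente sería marcar sobreajuste cuando la brecha supera un umbral:

```python
def marcar_sobreajuste(auprc_train, auprc_test, umbral=10.0):
    """Devuelve 1 si la brecha de AUPRC entre train y test
    supera el umbral (en puntos porcentuales); 0 en caso contrario."""
    return int(auprc_train - auprc_test > umbral)

# Con los valores del fold 2: 99.81 (train) frente a 63.73 (test)
print(marcar_sobreajuste(99.81, 63.73))  # brecha de ~36 puntos → 1
```

Tanto la función como el umbral son ilustrativos; lo que sí muestra el log es que el modelo memoriza el conjunto balanceado con SMOTE y generaliza mucho peor sobre la distribución real.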

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6469155	total: 68.6ms	remaining: 15.3s
[... iteraciones 1-222 omitidas ...]
223:	learn: 0.0271548	total: 21.8s	remaining: 0us
[I 2024-12-19 14:08:49,372] Trial 1 finished with value: 64.77081821400259 and parameters: {'learning_rate': 0.019464969464643017, 'max_depth': 4, 'n_estimators': 224, 'scale_pos_weight': 10.167674687462965}. Best is trial 0 with value: 71.43498362547278.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.60
 - Recall_Train: 100.00
 - AUPRC_Train: 99.81
 - Accuracy_Train: 97.14
 - F1-Score_Train: 97.22
 - Precision_Test: 2.57
 - Recall_Test: 90.48
 - AUPRC_Test: 65.63
 - Accuracy_Test: 94.22
 - F1-Score_Test: 5.00
 - Hiperparámetros ajustados (todos los demás parámetros de CatBoost quedaron sin fijar, con valor None):
 - max_depth: 4
 - n_estimators: 224
 - learning_rate: 0.02
 - scale_pos_weight: 10.17
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 64.7708

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
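The "Antes/Después de SMOTE" proportions above show the minority class being oversampled to a 50/50 split inside the training fold. The notebook presumably uses imblearn's `SMOTE`; this minimal NumPy/scikit-learn version shows the underlying interpolation idea:

```python
# Hedged sketch of SMOTE: synthesize minority samples by interpolating each
# minority point toward one of its k nearest minority neighbors.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def smote_minimal(X_min, n_new, k=5, seed=0):
    """Generate n_new synthetic minority samples via neighbor interpolation."""
    rng = np.random.default_rng(seed)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    _, idx = nn.kneighbors(X_min)            # idx[:, 0] is the point itself
    base = rng.integers(0, len(X_min), n_new)
    neigh = idx[base, rng.integers(1, k + 1, n_new)]
    gap = rng.random((n_new, 1))             # random position on the segment
    return X_min[base] + gap * (X_min[neigh] - X_min[base])

rng = np.random.default_rng(1)
X_maj = rng.normal(0, 1, (500, 4))           # majority class (Class 0)
X_min = rng.normal(3, 1, (20, 4))            # minority class (Class 1)
X_new = smote_minimal(X_min, n_new=len(X_maj) - len(X_min))
X_bal = np.vstack([X_maj, X_min, X_new])
y_bal = np.r_[np.zeros(len(X_maj)), np.ones(len(X_min) + len(X_new))]
print(np.bincount(y_bal.astype(int)) / len(y_bal))  # → [0.5 0.5]
```

Applying this only to the training fold (never the test fold) is what keeps the "Después de SMOTE" 0.50/0.50 split from leaking into evaluation.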

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5916089	total: 80.1ms	remaining: 16.2s
... (iterations 1-201 of the per-iteration CatBoost log omitted) ...
202:	learn: 0.0078467	total: 19.3s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.37
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 99.17
 - F1-Score_Train: 99.18
 - Precision_Test: 7.89
 - Recall_Test: 88.10
 - AUPRC_Test: 68.46
 - Accuracy_Test: 98.25
 - F1-Score_Test: 14.48
 - max_depth: 5
 - n_estimators: 203
 - learning_rate: 0.04
 - scale_pos_weight: 9.79
 - (all other CatBoost parameters: None, i.e. library defaults — full dump omitted)
✅ Tamaño del DataFrame actualizado: (1, 133)
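The AUPRC_Test figures reported above are the key metric here because, with roughly 0.17% fraud, accuracy is uninformative: a model that always predicts "no fraud" scores ~99.8% accuracy. A short sketch, assuming scikit-learn's `average_precision_score` is the AUPRC estimator (the usual choice):

```python
# Hedged sketch: why the notebook tracks AUPRC instead of accuracy at a
# ~0.17% positive rate (the proportion shown in the "Antes de SMOTE" output).
import numpy as np
from sklearn.metrics import average_precision_score, accuracy_score

rng = np.random.default_rng(0)
y_true = (rng.random(10_000) < 0.0017).astype(int)   # ~0.17% positives
trivial = np.zeros_like(y_true)                      # always "no fraud"

# A useless constant prediction still yields near-perfect accuracy...
print(f"Accuracy (trivial): {accuracy_score(y_true, trivial):.4f}")
# ...while AUPRC for uninformative scores collapses toward the base rate.
random_scores = rng.random(10_000)
print(f"AUPRC (random scores): {average_precision_score(y_true, random_scores):.4f}")
```

This is why a test AUPRC near 70 is meaningful here even though test accuracy barely moves between techniques.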

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5956403	total: 80.6ms	remaining: 16.3s
... (iterations 1-201 of the per-iteration CatBoost log omitted) ...
202:	learn: 0.0116958	total: 21.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.87
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 98.91
 - F1-Score_Train: 98.92
 - Precision_Test: 6.65
 - Recall_Test: 92.86
 - AUPRC_Test: 70.92
 - Accuracy_Test: 97.80
 - F1-Score_Test: 12.41
 - max_depth: 5
 - n_estimators: 203
 - learning_rate: 0.04
 - scale_pos_weight: 9.79
 - (all other CatBoost parameters: None, i.e. library defaults — full dump omitted)
✅ Tamaño del DataFrame actualizado: (2, 133)
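The exact rule behind the "Sobreajuste: 1" flag is not shown in this output; a plausible minimal version is simply a large train-test AUPRC gap (e.g. AUPRC_Train ≈ 99.96 vs AUPRC_Test ≈ 70.92 in Fold 2). The sketch below also shows the conventional way to derive the `scale_pos_weight` values seen in the parameter summaries, as the negative/positive ratio of the training data:

```python
# Hedged sketch: both the max_gap threshold and the counts are illustrative
# assumptions, not values taken from the notebook's code.
def overfit_flag(auprc_train, auprc_test, max_gap=10.0):
    """Return 1 when the train-test AUPRC gap (in points) exceeds max_gap."""
    return int(auprc_train - auprc_test > max_gap)

def scale_pos_weight(n_neg, n_pos):
    """Negative/positive ratio, the usual setting for CatBoost's
    scale_pos_weight parameter."""
    return round(n_neg / n_pos, 2)

print(overfit_flag(99.96, 70.92))   # → 1, consistent with the fold above
print(scale_pos_weight(979, 100))   # → 9.79
```

A gap this large despite SMOTE suggests the synthetic 50/50 training distribution is much easier than the real 0.17%-fraud test distribution, which is what the low Precision_Test values also reflect.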

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5960978	total: 74.5ms	remaining: 15.1s
... (per-iteration CatBoost log omitted) ...
87:	learn: 0.0265193	total: 7.3s	remaining: 9.54s
88:	learn: 0.0263379	total: 7.42s	remaining: 9.5s
89:	learn: 0.0258581	total: 7.55s	remaining: 9.48s
90:	learn: 0.0255417	total: 7.69s	remaining: 9.46s
91:	learn: 0.0253493	total: 7.84s	remaining: 9.46s
92:	learn: 0.0250420	total: 8s	remaining: 9.46s
93:	learn: 0.0247635	total: 8.18s	remaining: 9.48s
94:	learn: 0.0245473	total: 8.34s	remaining: 9.48s
95:	learn: 0.0243322	total: 8.48s	remaining: 9.46s
96:	learn: 0.0240373	total: 8.64s	remaining: 9.45s
97:	learn: 0.0238736	total: 8.82s	remaining: 9.45s
98:	learn: 0.0235657	total: 8.97s	remaining: 9.42s
99:	learn: 0.0232629	total: 9.14s	remaining: 9.41s
100:	learn: 0.0228727	total: 9.3s	remaining: 9.39s
101:	learn: 0.0226968	total: 9.46s	remaining: 9.37s
102:	learn: 0.0225319	total: 9.61s	remaining: 9.33s
103:	learn: 0.0222126	total: 9.78s	remaining: 9.31s
104:	learn: 0.0220070	total: 9.94s	remaining: 9.27s
105:	learn: 0.0217011	total: 10.1s	remaining: 9.24s
106:	learn: 0.0215357	total: 10.3s	remaining: 9.2s
107:	learn: 0.0212269	total: 10.4s	remaining: 9.19s
108:	learn: 0.0209961	total: 10.6s	remaining: 9.14s
109:	learn: 0.0207860	total: 10.8s	remaining: 9.09s
110:	learn: 0.0205374	total: 10.9s	remaining: 9.04s
111:	learn: 0.0203404	total: 11s	remaining: 8.97s
112:	learn: 0.0201033	total: 11.2s	remaining: 8.92s
113:	learn: 0.0199168	total: 11.3s	remaining: 8.85s
114:	learn: 0.0197273	total: 11.5s	remaining: 8.81s
115:	learn: 0.0195684	total: 11.7s	remaining: 8.75s
116:	learn: 0.0193955	total: 11.8s	remaining: 8.69s
117:	learn: 0.0192221	total: 12s	remaining: 8.65s
118:	learn: 0.0190057	total: 12.1s	remaining: 8.54s
119:	learn: 0.0188041	total: 12.2s	remaining: 8.42s
120:	learn: 0.0186380	total: 12.3s	remaining: 8.3s
121:	learn: 0.0185283	total: 12.4s	remaining: 8.21s
122:	learn: 0.0183654	total: 12.5s	remaining: 8.11s
123:	learn: 0.0181939	total: 12.5s	remaining: 7.99s
124:	learn: 0.0180491	total: 12.6s	remaining: 7.88s
125:	learn: 0.0179155	total: 12.7s	remaining: 7.77s
126:	learn: 0.0178327	total: 12.8s	remaining: 7.65s
127:	learn: 0.0176357	total: 12.9s	remaining: 7.54s
128:	learn: 0.0175052	total: 13s	remaining: 7.43s
129:	learn: 0.0173546	total: 13s	remaining: 7.32s
130:	learn: 0.0172336	total: 13.1s	remaining: 7.21s
131:	learn: 0.0171143	total: 13.2s	remaining: 7.1s
132:	learn: 0.0169573	total: 13.3s	remaining: 6.99s
133:	learn: 0.0167480	total: 13.4s	remaining: 6.89s
134:	learn: 0.0165936	total: 13.5s	remaining: 6.8s
135:	learn: 0.0164139	total: 13.6s	remaining: 6.69s
136:	learn: 0.0162355	total: 13.7s	remaining: 6.58s
137:	learn: 0.0161207	total: 13.8s	remaining: 6.48s
138:	learn: 0.0160286	total: 13.8s	remaining: 6.36s
139:	learn: 0.0159232	total: 13.9s	remaining: 6.25s
140:	learn: 0.0157882	total: 14s	remaining: 6.15s
141:	learn: 0.0156751	total: 14.1s	remaining: 6.04s
142:	learn: 0.0155150	total: 14.1s	remaining: 5.93s
143:	learn: 0.0153902	total: 14.2s	remaining: 5.83s
144:	learn: 0.0152767	total: 14.3s	remaining: 5.72s
145:	learn: 0.0151711	total: 14.4s	remaining: 5.62s
146:	learn: 0.0150027	total: 14.5s	remaining: 5.52s
147:	learn: 0.0148923	total: 14.6s	remaining: 5.42s
148:	learn: 0.0148117	total: 14.6s	remaining: 5.31s
149:	learn: 0.0147420	total: 14.7s	remaining: 5.2s
150:	learn: 0.0146121	total: 14.8s	remaining: 5.1s
151:	learn: 0.0145269	total: 14.9s	remaining: 4.99s
152:	learn: 0.0144553	total: 15s	remaining: 4.89s
153:	learn: 0.0143938	total: 15s	remaining: 4.79s
154:	learn: 0.0142485	total: 15.1s	remaining: 4.69s
155:	learn: 0.0141588	total: 15.2s	remaining: 4.58s
156:	learn: 0.0140341	total: 15.3s	remaining: 4.48s
157:	learn: 0.0139837	total: 15.4s	remaining: 4.38s
158:	learn: 0.0139247	total: 15.5s	remaining: 4.28s
159:	learn: 0.0138051	total: 15.6s	remaining: 4.18s
160:	learn: 0.0137276	total: 15.6s	remaining: 4.08s
161:	learn: 0.0136045	total: 15.7s	remaining: 3.98s
162:	learn: 0.0134978	total: 15.8s	remaining: 3.88s
163:	learn: 0.0134174	total: 15.9s	remaining: 3.78s
164:	learn: 0.0133438	total: 16s	remaining: 3.68s
165:	learn: 0.0132242	total: 16.1s	remaining: 3.58s
166:	learn: 0.0131090	total: 16.1s	remaining: 3.48s
167:	learn: 0.0130571	total: 16.2s	remaining: 3.38s
168:	learn: 0.0129420	total: 16.3s	remaining: 3.28s
169:	learn: 0.0128796	total: 16.4s	remaining: 3.18s
170:	learn: 0.0128037	total: 16.5s	remaining: 3.08s
171:	learn: 0.0127002	total: 16.5s	remaining: 2.98s
172:	learn: 0.0126270	total: 16.6s	remaining: 2.88s
173:	learn: 0.0125193	total: 16.7s	remaining: 2.79s
174:	learn: 0.0123884	total: 16.8s	remaining: 2.69s
175:	learn: 0.0123437	total: 16.9s	remaining: 2.59s
176:	learn: 0.0122590	total: 17s	remaining: 2.49s
177:	learn: 0.0121607	total: 17s	remaining: 2.39s
178:	learn: 0.0120465	total: 17.1s	remaining: 2.3s
179:	learn: 0.0119556	total: 17.2s	remaining: 2.2s
180:	learn: 0.0118952	total: 17.3s	remaining: 2.1s
181:	learn: 0.0118026	total: 17.4s	remaining: 2s
182:	learn: 0.0116907	total: 17.5s	remaining: 1.91s
183:	learn: 0.0115758	total: 17.6s	remaining: 1.81s
184:	learn: 0.0115333	total: 17.7s	remaining: 1.72s
185:	learn: 0.0114512	total: 17.8s	remaining: 1.62s
186:	learn: 0.0113798	total: 17.8s	remaining: 1.53s
187:	learn: 0.0112815	total: 17.9s	remaining: 1.43s
188:	learn: 0.0112439	total: 18s	remaining: 1.33s
189:	learn: 0.0111647	total: 18.1s	remaining: 1.24s
190:	learn: 0.0110910	total: 18.2s	remaining: 1.14s
191:	learn: 0.0110540	total: 18.2s	remaining: 1.04s
192:	learn: 0.0110252	total: 18.3s	remaining: 949ms
193:	learn: 0.0109489	total: 18.4s	remaining: 853ms
194:	learn: 0.0108762	total: 18.5s	remaining: 758ms
195:	learn: 0.0108002	total: 18.6s	remaining: 663ms
196:	learn: 0.0107086	total: 18.6s	remaining: 568ms
197:	learn: 0.0106739	total: 18.7s	remaining: 473ms
198:	learn: 0.0105914	total: 18.8s	remaining: 378ms
199:	learn: 0.0104961	total: 18.9s	remaining: 283ms
200:	learn: 0.0104354	total: 19s	remaining: 189ms
201:	learn: 0.0103575	total: 19s	remaining: 94.3ms
202:	learn: 0.0102569	total: 19.1s	remaining: 0us
[I 2024-12-19 14:09:55,420] Trial 2 finished with value: 70.54737699484406 and parameters: {'learning_rate': 0.03777341007132233, 'max_depth': 5, 'n_estimators': 203, 'scale_pos_weight': 9.790762307668434}. Best is trial 0 with value: 71.43498362547278.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.93
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 98.95
 - F1-Score_Train: 98.96
 - Precision_Test: 6.46
 - Recall_Test: 88.89
 - AUPRC_Test: 72.27
 - Accuracy_Test: 97.81
 - F1-Score_Test: 12.04
 - max_depth: 5
 - n_estimators: 203
 - learning_rate: 0.04
 - scale_pos_weight: 9.79
 - (remaining CatBoost hyperparameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 70.5474
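The AUPRC figure averaged above is average precision. As a reference for what that number means, here is a self-contained numpy sketch of the usual step-wise definition (precision summed over recall increments, scanning thresholds from high to low); the notebook itself presumably computes it via scikit-learn:

```python
import numpy as np

def average_precision(y_true, scores):
    """Average precision (the AUPRC reported above): precision summed
    over the recall increments, scanning thresholds high to low."""
    y = np.asarray(y_true)
    s = np.asarray(scores)
    order = np.argsort(-s)            # rank predictions by score, descending
    y = y[order]
    tp = np.cumsum(y)                 # true positives at each cutoff
    precision = tp / np.arange(1, len(y) + 1)
    recall = tp / y.sum()
    prev = np.concatenate(([0.0], recall[:-1]))
    return float(np.sum((recall - prev) * precision))

y_true = [0, 0, 1, 1, 0, 1]
scores = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7]
print(round(100 * average_precision(y_true, scores), 2))  # → 91.67
```

Unlike accuracy, this metric ignores true negatives entirely, which is why it is the reference metric for a dataset where ~99.8% of rows are the negative class.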

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
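The before/after class proportions printed above come from SMOTE oversampling. As a dependency-free illustration (not the notebook's actual imbalanced-learn call), a minimal numpy sketch of SMOTE's core idea, synthesizing minority samples by interpolating toward nearest minority-class neighbours:

```python
import numpy as np

def smote_sketch(X_min, n_new, k=5, seed=0):
    """Simplified SMOTE: create n_new synthetic minority samples by
    interpolating each chosen sample toward one of its k nearest
    minority-class neighbours."""
    rng = np.random.default_rng(seed)
    n = len(X_min)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    neighbours = np.argsort(d, axis=1)[:, :k]      # k nearest per sample
    base = rng.integers(0, n, size=n_new)          # random base samples
    nb = neighbours[base, rng.integers(0, k, size=n_new)]
    gap = rng.random((n_new, 1))                   # interpolation factor in [0, 1)
    return X_min[base] + gap * (X_min[nb] - X_min[base])

# Balance a toy majority/minority split like the one printed above
rng = np.random.default_rng(1)
X_maj = rng.normal(0, 1, (998, 2))
X_min = rng.normal(3, 1, (20, 2))
X_syn = smote_sketch(X_min, n_new=len(X_maj) - len(X_min))
print(len(X_min) + len(X_syn), len(X_maj))  # → 998 998 (a 0.50/0.50 split)
```

The real SMOTE in imbalanced-learn adds neighbour-selection details and variants for categorical features; this sketch only shows the interpolation step that produces the balanced proportions above.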

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6611628	total: 94.4ms	remaining: 27.4s
1:	learn: 0.6293344	total: 180ms	remaining: 26s
2:	learn: 0.6010530	total: 271ms	remaining: 26s
[... iterations 3 to 289 omitted; the learn loss decreases from 0.5667 to 0.0126 ...]
290:	learn: 0.0124985	total: 34.8s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 96.25
 - Recall_Train: 100.00
 - AUPRC_Train: 99.94
 - Accuracy_Train: 98.05
 - F1-Score_Train: 98.09
 - Precision_Test: 3.70
 - Recall_Test: 91.27
 - AUPRC_Test: 71.92
 - Accuracy_Test: 95.99
 - F1-Score_Test: 7.12
 - max_depth: 6
 - n_estimators: 291
 - learning_rate: 0.01
 - scale_pos_weight: 13.74
 - (remaining CatBoost hyperparameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)
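The `Sobreajuste: 1` flag in the fold above accompanies near-perfect train metrics (AUPRC_Train 99.94) against much weaker test metrics (AUPRC_Test 71.92), the signature of overfitting to the SMOTE-resampled training set. A minimal sketch of such a flag; the 10-point threshold here is an assumption for illustration, not a value taken from the notebook:

```python
def overfit_flag(train_metric, test_metric, threshold=10.0):
    """Return 1 when the train/test gap (in percentage points) exceeds
    the threshold; threshold=10.0 is an assumed value."""
    return int(train_metric - test_metric > threshold)

# Values from the fold reported above
print(overfit_flag(99.94, 71.92))  # AUPRC gap of ~28 points → 1
print(overfit_flag(82.0, 80.5))    # small gap → 0
```

Whatever the exact rule, comparing train-set and test-set AUPRC is the relevant check here, since SMOTE inflates train-set scores by construction.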

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6594691	total: 157ms	remaining: 45.7s
1:	learn: 0.6275724	total: 333ms	remaining: 48.1s
2:	learn: 0.6012532	total: 506ms	remaining: 48.5s
[... iterations 3 to 70 omitted; the learn loss decreases from 0.5737 to 0.0767 ...]
71:	learn: 0.0752605	total: 8.21s	remaining: 25s
72:	learn: 0.0737901	total: 8.31s	remaining: 24.8s
73:	learn: 0.0725811	total: 8.4s	remaining: 24.6s
74:	learn: 0.0715194	total: 8.49s	remaining: 24.5s
75:	learn: 0.0701712	total: 8.59s	remaining: 24.3s
76:	learn: 0.0690710	total: 8.69s	remaining: 24.1s
77:	learn: 0.0678943	total: 8.8s	remaining: 24s
78:	learn: 0.0671488	total: 8.9s	remaining: 23.9s
79:	learn: 0.0665177	total: 8.99s	remaining: 23.7s
80:	learn: 0.0659380	total: 9.08s	remaining: 23.5s
81:	learn: 0.0650390	total: 9.17s	remaining: 23.4s
82:	learn: 0.0642630	total: 9.28s	remaining: 23.2s
83:	learn: 0.0634777	total: 9.38s	remaining: 23.1s
84:	learn: 0.0630122	total: 9.47s	remaining: 22.9s
85:	learn: 0.0622138	total: 9.57s	remaining: 22.8s
86:	learn: 0.0614550	total: 9.67s	remaining: 22.7s
87:	learn: 0.0608338	total: 9.77s	remaining: 22.5s
88:	learn: 0.0601224	total: 9.86s	remaining: 22.4s
89:	learn: 0.0594851	total: 9.97s	remaining: 22.3s
90:	learn: 0.0588055	total: 10.1s	remaining: 22.1s
91:	learn: 0.0580016	total: 10.2s	remaining: 22s
92:	learn: 0.0572389	total: 10.3s	remaining: 21.9s
93:	learn: 0.0564402	total: 10.4s	remaining: 21.7s
94:	learn: 0.0557401	total: 10.5s	remaining: 21.6s
95:	learn: 0.0552228	total: 10.6s	remaining: 21.4s
96:	learn: 0.0545733	total: 10.6s	remaining: 21.3s
97:	learn: 0.0537594	total: 10.7s	remaining: 21.2s
98:	learn: 0.0532549	total: 10.8s	remaining: 21s
99:	learn: 0.0527750	total: 10.9s	remaining: 20.9s
100:	learn: 0.0522981	total: 11s	remaining: 20.8s
101:	learn: 0.0517355	total: 11.1s	remaining: 20.6s
102:	learn: 0.0512957	total: 11.2s	remaining: 20.5s
103:	learn: 0.0507837	total: 11.3s	remaining: 20.4s
104:	learn: 0.0503474	total: 11.4s	remaining: 20.2s
105:	learn: 0.0499711	total: 11.5s	remaining: 20.1s
106:	learn: 0.0495831	total: 11.6s	remaining: 20s
107:	learn: 0.0493289	total: 11.7s	remaining: 19.8s
108:	learn: 0.0490261	total: 11.8s	remaining: 19.7s
109:	learn: 0.0486288	total: 11.9s	remaining: 19.6s
110:	learn: 0.0482793	total: 12s	remaining: 19.5s
111:	learn: 0.0478620	total: 12.1s	remaining: 19.3s
112:	learn: 0.0473585	total: 12.2s	remaining: 19.2s
113:	learn: 0.0468950	total: 12.3s	remaining: 19.1s
114:	learn: 0.0464139	total: 12.4s	remaining: 19s
115:	learn: 0.0459991	total: 12.5s	remaining: 18.8s
116:	learn: 0.0457630	total: 12.6s	remaining: 18.7s
117:	learn: 0.0453622	total: 12.7s	remaining: 18.6s
118:	learn: 0.0450711	total: 12.8s	remaining: 18.4s
119:	learn: 0.0447891	total: 12.9s	remaining: 18.3s
120:	learn: 0.0445348	total: 13s	remaining: 18.2s
121:	learn: 0.0441572	total: 13.1s	remaining: 18.1s
122:	learn: 0.0436840	total: 13.2s	remaining: 18.1s
123:	learn: 0.0434702	total: 13.4s	remaining: 18.1s
124:	learn: 0.0431136	total: 13.6s	remaining: 18s
125:	learn: 0.0427236	total: 13.8s	remaining: 18s
126:	learn: 0.0424490	total: 13.9s	remaining: 18s
127:	learn: 0.0421411	total: 14.1s	remaining: 18s
128:	learn: 0.0419233	total: 14.3s	remaining: 17.9s
129:	learn: 0.0416816	total: 14.4s	remaining: 17.9s
130:	learn: 0.0411869	total: 14.6s	remaining: 17.8s
131:	learn: 0.0408565	total: 14.8s	remaining: 17.8s
132:	learn: 0.0406402	total: 14.9s	remaining: 17.7s
133:	learn: 0.0404387	total: 15.1s	remaining: 17.7s
134:	learn: 0.0401940	total: 15.2s	remaining: 17.6s
135:	learn: 0.0399058	total: 15.4s	remaining: 17.6s
136:	learn: 0.0396304	total: 15.6s	remaining: 17.5s
137:	learn: 0.0393333	total: 15.8s	remaining: 17.5s
138:	learn: 0.0390597	total: 16s	remaining: 17.4s
139:	learn: 0.0388478	total: 16.1s	remaining: 17.4s
140:	learn: 0.0384874	total: 16.3s	remaining: 17.4s
141:	learn: 0.0381802	total: 16.5s	remaining: 17.3s
142:	learn: 0.0379357	total: 16.7s	remaining: 17.3s
143:	learn: 0.0376081	total: 16.8s	remaining: 17.2s
144:	learn: 0.0373371	total: 17s	remaining: 17.1s
145:	learn: 0.0370520	total: 17.2s	remaining: 17.1s
146:	learn: 0.0368311	total: 17.4s	remaining: 17s
147:	learn: 0.0365843	total: 17.6s	remaining: 17s
148:	learn: 0.0362837	total: 17.8s	remaining: 16.9s
149:	learn: 0.0359883	total: 17.9s	remaining: 16.9s
150:	learn: 0.0357635	total: 18.1s	remaining: 16.8s
151:	learn: 0.0355506	total: 18.3s	remaining: 16.7s
152:	learn: 0.0352977	total: 18.4s	remaining: 16.6s
153:	learn: 0.0350629	total: 18.5s	remaining: 16.5s
154:	learn: 0.0348023	total: 18.6s	remaining: 16.3s
155:	learn: 0.0344805	total: 18.7s	remaining: 16.2s
156:	learn: 0.0342998	total: 18.8s	remaining: 16s
157:	learn: 0.0341763	total: 18.9s	remaining: 15.9s
158:	learn: 0.0339107	total: 19s	remaining: 15.8s
159:	learn: 0.0336564	total: 19.1s	remaining: 15.6s
160:	learn: 0.0334715	total: 19.2s	remaining: 15.5s
161:	learn: 0.0333343	total: 19.3s	remaining: 15.4s
162:	learn: 0.0331299	total: 19.4s	remaining: 15.2s
163:	learn: 0.0329401	total: 19.5s	remaining: 15.1s
164:	learn: 0.0327472	total: 19.6s	remaining: 14.9s
165:	learn: 0.0325185	total: 19.7s	remaining: 14.8s
166:	learn: 0.0322168	total: 19.8s	remaining: 14.7s
167:	learn: 0.0319919	total: 19.9s	remaining: 14.5s
168:	learn: 0.0317822	total: 19.9s	remaining: 14.4s
169:	learn: 0.0316502	total: 20s	remaining: 14.3s
170:	learn: 0.0314812	total: 20.1s	remaining: 14.1s
171:	learn: 0.0313232	total: 20.2s	remaining: 14s
172:	learn: 0.0311547	total: 20.3s	remaining: 13.9s
173:	learn: 0.0310229	total: 20.4s	remaining: 13.7s
174:	learn: 0.0308542	total: 20.5s	remaining: 13.6s
175:	learn: 0.0305505	total: 20.6s	remaining: 13.5s
176:	learn: 0.0303355	total: 20.7s	remaining: 13.3s
177:	learn: 0.0300911	total: 20.8s	remaining: 13.2s
178:	learn: 0.0299366	total: 20.9s	remaining: 13.1s
179:	learn: 0.0297774	total: 21s	remaining: 13s
180:	learn: 0.0296185	total: 21.1s	remaining: 12.8s
181:	learn: 0.0294956	total: 21.2s	remaining: 12.7s
182:	learn: 0.0293595	total: 21.3s	remaining: 12.6s
183:	learn: 0.0292075	total: 21.4s	remaining: 12.4s
184:	learn: 0.0290581	total: 21.5s	remaining: 12.3s
185:	learn: 0.0289389	total: 21.6s	remaining: 12.2s
186:	learn: 0.0287716	total: 21.7s	remaining: 12s
187:	learn: 0.0286050	total: 21.8s	remaining: 11.9s
188:	learn: 0.0284034	total: 21.9s	remaining: 11.8s
189:	learn: 0.0283160	total: 21.9s	remaining: 11.7s
190:	learn: 0.0281343	total: 22s	remaining: 11.5s
191:	learn: 0.0280213	total: 22.1s	remaining: 11.4s
192:	learn: 0.0278853	total: 22.2s	remaining: 11.3s
193:	learn: 0.0277443	total: 22.3s	remaining: 11.2s
194:	learn: 0.0276423	total: 22.4s	remaining: 11s
195:	learn: 0.0275183	total: 22.5s	remaining: 10.9s
196:	learn: 0.0273570	total: 22.6s	remaining: 10.8s
197:	learn: 0.0272002	total: 22.7s	remaining: 10.7s
198:	learn: 0.0271238	total: 22.8s	remaining: 10.5s
199:	learn: 0.0270289	total: 22.9s	remaining: 10.4s
200:	learn: 0.0268416	total: 23s	remaining: 10.3s
201:	learn: 0.0266881	total: 23.1s	remaining: 10.2s
202:	learn: 0.0264990	total: 23.1s	remaining: 10s
203:	learn: 0.0263795	total: 23.2s	remaining: 9.91s
204:	learn: 0.0262580	total: 23.3s	remaining: 9.79s
205:	learn: 0.0261664	total: 23.4s	remaining: 9.67s
206:	learn: 0.0260207	total: 23.5s	remaining: 9.55s
207:	learn: 0.0259191	total: 23.6s	remaining: 9.43s
208:	learn: 0.0257814	total: 23.7s	remaining: 9.3s
209:	learn: 0.0257171	total: 23.8s	remaining: 9.18s
210:	learn: 0.0256356	total: 23.9s	remaining: 9.06s
211:	learn: 0.0255258	total: 24s	remaining: 8.93s
212:	learn: 0.0254279	total: 24.1s	remaining: 8.81s
213:	learn: 0.0253353	total: 24.2s	remaining: 8.69s
214:	learn: 0.0252543	total: 24.3s	remaining: 8.57s
215:	learn: 0.0251213	total: 24.3s	remaining: 8.45s
216:	learn: 0.0250070	total: 24.4s	remaining: 8.34s
217:	learn: 0.0248831	total: 24.5s	remaining: 8.22s
218:	learn: 0.0248113	total: 24.6s	remaining: 8.1s
219:	learn: 0.0246903	total: 24.7s	remaining: 7.98s
220:	learn: 0.0246259	total: 24.8s	remaining: 7.86s
221:	learn: 0.0245070	total: 24.9s	remaining: 7.74s
222:	learn: 0.0243863	total: 25s	remaining: 7.62s
223:	learn: 0.0242869	total: 25.1s	remaining: 7.5s
224:	learn: 0.0241666	total: 25.2s	remaining: 7.38s
225:	learn: 0.0239706	total: 25.3s	remaining: 7.27s
226:	learn: 0.0238450	total: 25.4s	remaining: 7.15s
227:	learn: 0.0237829	total: 25.5s	remaining: 7.03s
228:	learn: 0.0236966	total: 25.6s	remaining: 6.92s
229:	learn: 0.0235957	total: 25.6s	remaining: 6.8s
230:	learn: 0.0234789	total: 25.7s	remaining: 6.68s
231:	learn: 0.0233116	total: 25.8s	remaining: 6.57s
232:	learn: 0.0232222	total: 25.9s	remaining: 6.45s
233:	learn: 0.0231627	total: 26s	remaining: 6.33s
234:	learn: 0.0230783	total: 26.1s	remaining: 6.22s
235:	learn: 0.0230109	total: 26.2s	remaining: 6.1s
236:	learn: 0.0229248	total: 26.3s	remaining: 5.99s
237:	learn: 0.0228134	total: 26.4s	remaining: 5.87s
238:	learn: 0.0227088	total: 26.4s	remaining: 5.75s
239:	learn: 0.0225960	total: 26.5s	remaining: 5.64s
240:	learn: 0.0225077	total: 26.6s	remaining: 5.53s
241:	learn: 0.0223944	total: 26.7s	remaining: 5.41s
242:	learn: 0.0222820	total: 26.8s	remaining: 5.3s
243:	learn: 0.0221660	total: 26.9s	remaining: 5.19s
244:	learn: 0.0220848	total: 27s	remaining: 5.07s
245:	learn: 0.0219974	total: 27.1s	remaining: 4.96s
246:	learn: 0.0219177	total: 27.2s	remaining: 4.85s
247:	learn: 0.0218294	total: 27.3s	remaining: 4.73s
248:	learn: 0.0217436	total: 27.4s	remaining: 4.62s
249:	learn: 0.0216805	total: 27.5s	remaining: 4.5s
250:	learn: 0.0216165	total: 27.6s	remaining: 4.39s
251:	learn: 0.0214642	total: 27.7s	remaining: 4.28s
252:	learn: 0.0213926	total: 27.7s	remaining: 4.17s
253:	learn: 0.0213139	total: 27.8s	remaining: 4.05s
254:	learn: 0.0212516	total: 27.9s	remaining: 3.94s
255:	learn: 0.0210941	total: 28s	remaining: 3.83s
256:	learn: 0.0210394	total: 28.1s	remaining: 3.72s
257:	learn: 0.0209351	total: 28.2s	remaining: 3.61s
258:	learn: 0.0208654	total: 28.3s	remaining: 3.5s
259:	learn: 0.0207557	total: 28.4s	remaining: 3.39s
260:	learn: 0.0206826	total: 28.6s	remaining: 3.29s
261:	learn: 0.0205752	total: 28.8s	remaining: 3.18s
262:	learn: 0.0205130	total: 28.9s	remaining: 3.08s
263:	learn: 0.0204343	total: 29.1s	remaining: 2.98s
264:	learn: 0.0203242	total: 29.3s	remaining: 2.87s
265:	learn: 0.0202681	total: 29.4s	remaining: 2.76s
266:	learn: 0.0202132	total: 29.6s	remaining: 2.66s
267:	learn: 0.0201043	total: 29.8s	remaining: 2.56s
268:	learn: 0.0200016	total: 30s	remaining: 2.45s
269:	learn: 0.0198953	total: 30.1s	remaining: 2.34s
270:	learn: 0.0198486	total: 30.3s	remaining: 2.23s
271:	learn: 0.0197396	total: 30.4s	remaining: 2.13s
272:	learn: 0.0196678	total: 30.6s	remaining: 2.02s
273:	learn: 0.0196156	total: 30.8s	remaining: 1.91s
274:	learn: 0.0195565	total: 31s	remaining: 1.8s
275:	learn: 0.0194697	total: 31.2s	remaining: 1.69s
276:	learn: 0.0193994	total: 31.4s	remaining: 1.58s
277:	learn: 0.0193482	total: 31.5s	remaining: 1.47s
278:	learn: 0.0192734	total: 31.7s	remaining: 1.36s
279:	learn: 0.0191495	total: 31.8s	remaining: 1.25s
280:	learn: 0.0190467	total: 32s	remaining: 1.14s
281:	learn: 0.0189537	total: 32.2s	remaining: 1.03s
282:	learn: 0.0189010	total: 32.4s	remaining: 915ms
283:	learn: 0.0188427	total: 32.5s	remaining: 802ms
284:	learn: 0.0187847	total: 32.7s	remaining: 689ms
285:	learn: 0.0186797	total: 32.9s	remaining: 575ms
286:	learn: 0.0186196	total: 33.1s	remaining: 461ms
287:	learn: 0.0185659	total: 33.2s	remaining: 346ms
288:	learn: 0.0185074	total: 33.4s	remaining: 231ms
289:	learn: 0.0184603	total: 33.6s	remaining: 116ms
290:	learn: 0.0183967	total: 33.7s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 95.09
 - Recall_Train: 100.00
 - AUPRC_Train: 99.93
 - Accuracy_Train: 97.42
 - F1-Score_Train: 97.48
 - Precision_Test: 3.07
 - Recall_Test: 96.83
 - AUPRC_Test: 70.08
 - Accuracy_Test: 94.85
 - F1-Score_Test: 5.95
 - learning_rate: 0.01
 - max_depth: 6
 - n_estimators: 291
 - scale_pos_weight: 13.74
 - (all other CatBoost hyperparameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)
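The fold above shows the typical SMOTE signature on this dataset: near-perfect training metrics (Recall_Train 100.00, AUPRC_Train 99.93) against a much weaker test AUPRC of 70.08 and a test precision of only 3.07, which is why the run is flagged with `Sobreajuste: 1`. The exact rule the notebook uses to set that flag is not shown in this output; a plausible sketch (the `tol` threshold is an assumption, not the notebook's actual value) is a simple train/test gap check:

```python
def overfit_flag(train_metric, test_metric, tol=5.0):
    """Hypothetical sketch: flag overfitting (1) when the train/test gap
    in a metric exceeds `tol` percentage points; otherwise 0. The real
    notebook's rule for 'Sobreajuste' may differ."""
    return int(train_metric - test_metric > tol)

# AUPRC gap from the fold above: 99.93 (train) vs 70.08 (test)
print(overfit_flag(99.93, 70.08))  # -> 1, a clear overfitting signal
```

A gap this large suggests the model memorises the SMOTE-balanced training set while generalising poorly to the original, highly imbalanced test distribution.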

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6609791	total: 95.7ms	remaining: 27.8s
1:	learn: 0.6211871	total: 183ms	remaining: 26.5s
2:	learn: 0.5951111	total: 275ms	remaining: 26.4s
... (iterations 3–287 omitted for brevity) ...
288:	learn: 0.0157431	total: 31.9s	remaining: 221ms
289:	learn: 0.0156872	total: 32s	remaining: 110ms
290:	learn: 0.0156574	total: 32.1s	remaining: 0us
[I 2024-12-19 14:11:43,682] Trial 3 finished with value: 70.27439832653555 and parameters: {'learning_rate': 0.011484619114104772, 'max_depth': 6, 'n_estimators': 291, 'scale_pos_weight': 13.73854959933593}. Best is trial 0 with value: 71.43498362547278.
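The Optuna line above records trial 3 finishing with a validation value of 70.27 while trial 0 remains the best at 71.43: each trial samples a hyperparameter configuration, trains CatBoost on the SMOTE-balanced fold, and reports AUPRC back to the study. A plain-Python random-search stand-in for that loop (the sampling ranges are assumptions read off the logged values, and the objective is a placeholder; the notebook itself uses Optuna's TPE sampler with a real CatBoost fit inside the objective):

```python
import random

random.seed(0)

def sample_params():
    """Sample one hyperparameter configuration. Ranges are illustrative
    guesses consistent with the logged trial, not the notebook's actual
    search space."""
    return {
        "learning_rate": 10 ** random.uniform(-2, -0.5),
        "max_depth": random.randint(4, 10),
        "n_estimators": random.randint(100, 500),
        "scale_pos_weight": random.uniform(1.0, 20.0),
    }

def validation_auprc(params):
    """Placeholder objective: the real notebook trains CatBoost with
    `params` on the SMOTE-balanced fold and returns the test AUPRC."""
    return 60.0 + 10.0 * random.random()

best_value, best_params = float("-inf"), None
for trial in range(5):
    params = sample_params()
    value = validation_auprc(params)
    if value > best_value:
        best_value, best_params = value, params
    print(f"Trial {trial} finished with value: {value:.2f}. Best: {best_value:.2f}")
```

Optuna's advantage over this sketch is that its sampler learns from completed trials instead of drawing configurations independently.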
✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 96.00
 - Recall_Train: 100.00
 - AUPRC_Train: 99.91
 - Accuracy_Train: 97.92
 - F1-Score_Train: 97.96
 - Precision_Test: 3.47
 - Recall_Test: 90.48
 - AUPRC_Test: 68.82
 - Accuracy_Test: 95.75
 - F1-Score_Test: 6.68
 - max_depth: 6
 - n_estimators: 291
 - learning_rate: 0.01
 - scale_pos_weight: 13.74
 - (all other CatBoost parameters: None, i.e. left at library defaults)
✅ Updated DataFrame size: (3, 133)

🏆 Mean AUPRC across cross-validation folds: 70.2744
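AUPRC here is the area under the precision-recall curve, reported as a percentage; it is the metric of choice for this highly imbalanced dataset. A minimal sketch of how such a score can be computed with scikit-learn's `average_precision_score` (the toy labels and scores below are illustrative, not taken from the dataset):

```python
import numpy as np
from sklearn.metrics import average_precision_score

# Toy, highly imbalanced labels (illustrative only, not the credit card data).
y_true = np.array([0, 0, 0, 0, 0, 0, 0, 0, 1, 1])
# Predicted fraud probabilities from some classifier.
y_score = np.array([0.1, 0.2, 0.1, 0.3, 0.2, 0.4, 0.1, 0.9, 0.8, 0.7])

# average_precision_score is the area under the precision-recall curve;
# multiplied by 100 it matches the percentage-style AUPRC reported above.
auprc = 100 * average_precision_score(y_true, y_score)
print(round(auprc, 4))  # -> 58.3333
```

Unlike accuracy, this score stays informative when positives are under 1% of the data, which is why the notebook ranks models by it.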

🔍 Optimizing CatBoost hyperparameters with Optuna...

🔄 Fold 1: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
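The proportion printouts above are produced with pandas `value_counts(normalize=True)` on the `Class` label before and after resampling. A minimal, dependency-light sketch of the same check; the real notebook uses `imblearn.over_sampling.SMOTE`, which is mimicked here with plain random oversampling of the minority class (the tiny synthetic frame is illustrative only):

```python
import pandas as pd

# Tiny synthetic label column (illustrative only, not the credit card data).
df = pd.DataFrame({"Class": [0] * 998 + [1] * 2})

# "Before SMOTE" style printout: class proportions.
before = df["Class"].value_counts(normalize=True)
print(before)

# The notebook balances with imblearn's SMOTE; here we mimic its effect on
# the class proportions by randomly oversampling the minority class.
minority = df[df["Class"] == 1]
n_needed = (df["Class"] == 0).sum() - len(minority)
balanced = pd.concat(
    [df, minority.sample(n=n_needed, replace=True, random_state=42)],
    ignore_index=True,
)

# "After SMOTE" style printout: proportions are now 0.5 / 0.5.
after = balanced["Class"].value_counts(normalize=True)
print(after)
```

Note that, as in the logs above, resampling is applied only to the training split of each fold, so the test proportions stay at their natural imbalance.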

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4831325	total: 88ms	remaining: 15.6s
[... iterations 1-176 omitted: learn loss decreasing to 0.0025392 ...]
177:	learn: 0.0025392	total: 18.1s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.47
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.74
 - F1-Score_Train: 99.74
 - Precision_Test: 19.07
 - Recall_Test: 88.10
 - AUPRC_Test: 78.09
 - Accuracy_Test: 99.35
 - F1-Score_Test: 31.36
 - max_depth: 6
 - n_estimators: 178
 - learning_rate: 0.09
 - scale_pos_weight: 11.22
 - (all other CatBoost parameters: None, i.e. left at library defaults)
✅ Updated DataFrame size: (1, 133)
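The overfitting indicator reported above (value 1) accompanies a large gap between train and test AUPRC (99.98 vs 78.09). The notebook's exact rule for setting the flag is not shown in the output; a plausible reconstruction, assuming the flag fires when the train/test gap exceeds a fixed tolerance in percentage points, might look like:

```python
# Hypothetical reconstruction of the overfitting flag: the notebook's actual
# rule is not shown, so we assume the flag is set whenever train AUPRC
# exceeds test AUPRC by more than a tolerance (in percentage points).
def overfit_flag(auprc_train: float, auprc_test: float, tolerance: float = 10.0) -> int:
    """Return 1 if the train/test AUPRC gap suggests overfitting, else 0."""
    return int(auprc_train - auprc_test > tolerance)

print(overfit_flag(99.98, 78.09))  # fold 1 above -> 1
print(overfit_flag(82.00, 79.50))  # small gap -> 0
```

A gap this large is typical when SMOTE-balanced training data makes the training set much easier than the untouched, imbalanced test set.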

🔄 Fold 2: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4711564	total: 172ms	remaining: 30.4s
[... iterations 1-176 omitted: learn loss decreasing to 0.0032651 ...]
177:	learn: 0.0032651	total: 20s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.38
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.69
 - F1-Score_Train: 99.69
 - Precision_Test: 19.24
 - Recall_Test: 88.89
 - AUPRC_Test: 72.98
 - Accuracy_Test: 99.35
 - F1-Score_Test: 31.64
 - max_depth: 6
 - n_estimators: 178
 - learning_rate: 0.09
 - scale_pos_weight: 11.22
 - (all other CatBoost parameters: None, i.e. left at library defaults)
✅ Updated DataFrame size: (2, 133)

🔄 Fold 3: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4793532	total: 83.3ms	remaining: 14.8s
[... iterations 1-97 omitted: learn loss decreasing to 0.0044985 ...]
98:	learn: 0.0044985	total: 10.9s	remaining: 8.72s
99:	learn: 0.0044252	total: 11.1s	remaining: 8.67s
100:	learn: 0.0043902	total: 11.2s	remaining: 8.56s
101:	learn: 0.0043008	total: 11.4s	remaining: 8.49s
102:	learn: 0.0042181	total: 11.6s	remaining: 8.44s
103:	learn: 0.0041706	total: 11.8s	remaining: 8.37s
104:	learn: 0.0041480	total: 11.9s	remaining: 8.29s
105:	learn: 0.0040938	total: 12.1s	remaining: 8.21s
106:	learn: 0.0040537	total: 12.2s	remaining: 8.13s
107:	learn: 0.0040355	total: 12.4s	remaining: 8.02s
108:	learn: 0.0039873	total: 12.5s	remaining: 7.9s
109:	learn: 0.0038977	total: 12.6s	remaining: 7.77s
110:	learn: 0.0037997	total: 12.7s	remaining: 7.64s
111:	learn: 0.0037683	total: 12.7s	remaining: 7.51s
112:	learn: 0.0037220	total: 12.8s	remaining: 7.38s
113:	learn: 0.0036514	total: 12.9s	remaining: 7.25s
114:	learn: 0.0036122	total: 13s	remaining: 7.13s
115:	learn: 0.0035302	total: 13.1s	remaining: 7s
116:	learn: 0.0035302	total: 13.2s	remaining: 6.87s
117:	learn: 0.0035302	total: 13.3s	remaining: 6.74s
118:	learn: 0.0034846	total: 13.4s	remaining: 6.63s
119:	learn: 0.0034451	total: 13.4s	remaining: 6.5s
120:	learn: 0.0033876	total: 13.5s	remaining: 6.38s
121:	learn: 0.0033759	total: 13.6s	remaining: 6.25s
122:	learn: 0.0033758	total: 13.7s	remaining: 6.12s
123:	learn: 0.0033046	total: 13.8s	remaining: 6.01s
124:	learn: 0.0032797	total: 13.9s	remaining: 5.88s
125:	learn: 0.0032430	total: 14s	remaining: 5.76s
126:	learn: 0.0032429	total: 14s	remaining: 5.64s
127:	learn: 0.0031999	total: 14.1s	remaining: 5.52s
128:	learn: 0.0031620	total: 14.2s	remaining: 5.4s
129:	learn: 0.0031348	total: 14.3s	remaining: 5.28s
130:	learn: 0.0031034	total: 14.4s	remaining: 5.17s
131:	learn: 0.0030666	total: 14.5s	remaining: 5.05s
132:	learn: 0.0030666	total: 14.6s	remaining: 4.93s
133:	learn: 0.0030666	total: 14.6s	remaining: 4.8s
134:	learn: 0.0030423	total: 14.7s	remaining: 4.69s
135:	learn: 0.0029943	total: 14.8s	remaining: 4.58s
136:	learn: 0.0029449	total: 14.9s	remaining: 4.46s
137:	learn: 0.0029143	total: 15s	remaining: 4.35s
138:	learn: 0.0028911	total: 15.1s	remaining: 4.23s
139:	learn: 0.0028460	total: 15.2s	remaining: 4.12s
140:	learn: 0.0028218	total: 15.3s	remaining: 4.01s
141:	learn: 0.0027830	total: 15.4s	remaining: 3.9s
142:	learn: 0.0027587	total: 15.5s	remaining: 3.79s
143:	learn: 0.0027038	total: 15.6s	remaining: 3.67s
144:	learn: 0.0026784	total: 15.7s	remaining: 3.56s
145:	learn: 0.0026784	total: 15.7s	remaining: 3.45s
146:	learn: 0.0026610	total: 15.8s	remaining: 3.33s
147:	learn: 0.0026610	total: 15.9s	remaining: 3.22s
148:	learn: 0.0026117	total: 16s	remaining: 3.11s
149:	learn: 0.0025924	total: 16.1s	remaining: 3s
150:	learn: 0.0025622	total: 16.2s	remaining: 2.89s
151:	learn: 0.0025279	total: 16.2s	remaining: 2.78s
152:	learn: 0.0025279	total: 16.3s	remaining: 2.66s
153:	learn: 0.0025279	total: 16.4s	remaining: 2.55s
154:	learn: 0.0025026	total: 16.5s	remaining: 2.45s
155:	learn: 0.0024967	total: 16.6s	remaining: 2.33s
156:	learn: 0.0024804	total: 16.7s	remaining: 2.23s
157:	learn: 0.0024484	total: 16.7s	remaining: 2.12s
158:	learn: 0.0024484	total: 16.8s	remaining: 2.01s
159:	learn: 0.0024484	total: 16.9s	remaining: 1.9s
160:	learn: 0.0024484	total: 17s	remaining: 1.79s
161:	learn: 0.0024484	total: 17s	remaining: 1.68s
162:	learn: 0.0024484	total: 17.1s	remaining: 1.57s
163:	learn: 0.0024484	total: 17.2s	remaining: 1.47s
164:	learn: 0.0024484	total: 17.2s	remaining: 1.36s
165:	learn: 0.0024318	total: 17.3s	remaining: 1.25s
166:	learn: 0.0023969	total: 17.4s	remaining: 1.15s
167:	learn: 0.0023740	total: 17.5s	remaining: 1.04s
168:	learn: 0.0023740	total: 17.6s	remaining: 938ms
169:	learn: 0.0023506	total: 17.7s	remaining: 833ms
170:	learn: 0.0023184	total: 17.8s	remaining: 728ms
171:	learn: 0.0023183	total: 17.9s	remaining: 624ms
172:	learn: 0.0023183	total: 17.9s	remaining: 519ms
173:	learn: 0.0023183	total: 18s	remaining: 414ms
174:	learn: 0.0023183	total: 18.1s	remaining: 310ms
175:	learn: 0.0023183	total: 18.2s	remaining: 207ms
176:	learn: 0.0023183	total: 18.2s	remaining: 103ms
177:	learn: 0.0023183	total: 18.3s	remaining: 0us
[I 2024-12-19 14:12:47,050] Trial 4 finished with value: 76.71888513333191 and parameters: {'learning_rate': 0.09180002730546635, 'max_depth': 6, 'n_estimators': 178, 'scale_pos_weight': 11.21811114252904}. Best is trial 4 with value: 76.71888513333191.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.56
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.78
 - F1-Score_Train: 99.78
 - Precision_Test: 21.28
 - Recall_Test: 87.30
 - AUPRC_Test: 79.09
 - Accuracy_Test: 99.44
 - F1-Score_Test: 34.21
 - max_depth: 6
 - n_estimators: 178
 - learning_rate: 0.09
 - scale_pos_weight: 11.22
 [... all other CatBoost hyperparameters were None (library defaults) and are omitted ...]
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 76.7189

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5626407	total: 108ms	remaining: 19.9s
[... iterations 1-183 omitted: learn loss decreases steadily from 0.4479 to 0.0033 ...]
184:	learn: 0.0032992	total: 21.8s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.13
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.56
 - F1-Score_Train: 99.57
 - Precision_Test: 12.92
 - Recall_Test: 88.89
 - AUPRC_Test: 75.69
 - Accuracy_Test: 98.97
 - F1-Score_Test: 22.56
 - max_depth: 6
 - n_estimators: 185
 - learning_rate: 0.05
 - scale_pos_weight: 13.38
 [... all other CatBoost hyperparameters were None (library defaults) and are omitted ...]
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5556976	total: 85.9ms	remaining: 15.8s
[... iterations 1-183 omitted: learn loss decreases steadily from 0.4490 to 0.0050 ...]
184:	learn: 0.0049807	total: 19.6s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.84
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.41
 - F1-Score_Train: 99.42
 - Precision_Test: 10.83
 - Recall_Test: 90.48
 - AUPRC_Test: 65.02
 - Accuracy_Test: 98.73
 - F1-Score_Test: 19.34
 - max_depth: 6
 - n_estimators: 185
 - learning_rate: 0.05
 - scale_pos_weight: 13.38
 - (all remaining CatBoost parameters: None — library defaults, omitted for brevity)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[... CatBoost training log, iterations 0–184: learn loss 0.5622818 → 0.0041933, total time 22s — verbose per-iteration output omitted ...]
[I 2024-12-19 14:13:57,003] Trial 5 finished with value: 72.089155485785 and parameters: {'learning_rate': 0.050682454663907056, 'max_depth': 6, 'n_estimators': 185, 'scale_pos_weight': 13.37714459399436}. Best is trial 4 with value: 76.71888513333191.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.89
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.44
 - F1-Score_Train: 99.44
 - Precision_Test: 11.07
 - Recall_Test: 87.30
 - AUPRC_Test: 75.56
 - Accuracy_Test: 98.80
 - F1-Score_Test: 19.64
 - max_depth: 6
 - n_estimators: 185
 - learning_rate: 0.05
 - scale_pos_weight: 13.38
 - (all remaining CatBoost parameters: None — library defaults, omitted for brevity)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 72.0892

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[... CatBoost training log, iterations 0–274: learn loss 0.6683742 → 0.0382136, total time 23.8s — verbose per-iteration output omitted ...]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.75
 - Recall_Train: 99.97
 - AUPRC_Train: 99.83
 - Accuracy_Train: 97.21
 - F1-Score_Train: 97.29
 - Precision_Test: 2.64
 - Recall_Test: 91.27
 - AUPRC_Test: 66.55
 - Accuracy_Test: 94.31
 - F1-Score_Test: 5.12
 - (all other CatBoost parameters are None, i.e. library defaults; only the values set by the Optuna search are shown)
 - max_depth: 4
 - n_estimators: 275
 - learning_rate: 0.01
 - scale_pos_weight: 6.79
✅ Updated DataFrame size: (1, 133)
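Each fold above rebalances only its training split to a 50/50 class ratio before fitting. A minimal, self-contained sketch of that rebalancing step — using random duplication of the minority class as a stand-in for SMOTE (which would synthesize new interpolated points instead), on toy data rather than the notebook's pipeline:

```python
import random
from collections import Counter

def class_proportions(y):
    """Class -> fraction of samples, as printed before/after balancing."""
    counts = Counter(y)
    return {cls: counts[cls] / len(y) for cls in sorted(counts)}

def oversample_minority(X, y, seed=42):
    """Duplicate minority-class rows until the classes are 50/50.
    Stand-in for SMOTE, which would synthesize new points instead."""
    rng = random.Random(seed)
    counts = Counter(y)
    minority = min(counts, key=counts.get)
    majority_count = max(counts.values())
    minority_idx = [i for i, label in enumerate(y) if label == minority]
    X_bal, y_bal = list(X), list(y)
    while y_bal.count(minority) < majority_count:
        i = rng.choice(minority_idx)  # pick a minority row to duplicate
        X_bal.append(X[i])
        y_bal.append(y[i])
    return X_bal, y_bal

# Toy imbalanced training fold: 99 legitimate (0) vs 1 fraud (1)
X = [[float(i)] for i in range(100)]
y = [0] * 99 + [1]
print(class_proportions(y))        # class 1 is only 1% of the fold
X_bal, y_bal = oversample_minority(X, y)
print(class_proportions(y_bal))    # → {0: 0.5, 1: 0.5}
```

In the notebook this role is played by `imblearn`'s SMOTE, fitted inside each fold so the test split keeps its original ~0.17% fraud rate.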

🔄 Fold 2: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.6702020	total: 126ms	remaining: 34.4s
[... per-iteration CatBoost training log truncated (iterations 1-273) ...]
274:	learn: 0.0595960	total: 26.3s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 92.74
 - Recall_Train: 99.96
 - AUPRC_Train: 99.71
 - Accuracy_Train: 96.06
 - F1-Score_Train: 96.21
 - Precision_Test: 2.05
 - Recall_Test: 96.83
 - AUPRC_Test: 62.23
 - Accuracy_Test: 92.22
 - F1-Score_Test: 4.02
 - (all other CatBoost parameters are None, i.e. library defaults; only the values set by the Optuna search are shown)
 - max_depth: 4
 - n_estimators: 275
 - learning_rate: 0.01
 - scale_pos_weight: 6.79
✅ Updated DataFrame size: (2, 133)
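The very low test precision next to near-perfect train metrics is what the overfitting flag captures. As a sanity check, the reported test F1 follows from the reported precision and recall (harmonic mean), and a simple train/test-gap rule reproduces the flag — a sketch with an assumed 20-point threshold, which is an illustrative choice, not the notebook's actual rule:

```python
def f1_from_pr(precision, recall):
    """Harmonic mean of precision and recall (both in percent)."""
    return 2 * precision * recall / (precision + recall)

def overfit_flag(train_auprc, test_auprc, gap_threshold=20.0):
    """Flag overfitting when the train/test AUPRC gap (in percentage
    points) exceeds a threshold; the 20-point default is an assumption."""
    return int(train_auprc - test_auprc > gap_threshold)

# Fold 2 above: Precision_Test 2.05, Recall_Test 96.83
print(round(f1_from_pr(2.05, 96.83), 2))   # close to the reported F1 of 4.02
# AUPRC_Train 99.71 vs AUPRC_Test 62.23 -> ~37-point gap, flagged
print(overfit_flag(99.71, 62.23))          # → 1
```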

🔄 Fold 3: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.6699194	total: 70.8ms	remaining: 19.4s
[... per-iteration CatBoost training log truncated (iterations 1-255) ...]
256:	learn: 0.0509563	total: 24.6s	remaining: 1.73s
257:	learn: 0.0508605	total: 24.7s	remaining: 1.63s
258:	learn: 0.0506742	total: 24.8s	remaining: 1.53s
259:	learn: 0.0505840	total: 24.9s	remaining: 1.43s
260:	learn: 0.0504827	total: 24.9s	remaining: 1.34s
261:	learn: 0.0503421	total: 25s	remaining: 1.24s
262:	learn: 0.0502149	total: 25.1s	remaining: 1.15s
263:	learn: 0.0501221	total: 25.2s	remaining: 1.05s
264:	learn: 0.0499516	total: 25.3s	remaining: 953ms
265:	learn: 0.0498571	total: 25.3s	remaining: 857ms
266:	learn: 0.0497165	total: 25.4s	remaining: 761ms
267:	learn: 0.0495790	total: 25.5s	remaining: 666ms
268:	learn: 0.0494531	total: 25.6s	remaining: 570ms
269:	learn: 0.0493420	total: 25.6s	remaining: 475ms
270:	learn: 0.0491801	total: 25.7s	remaining: 380ms
271:	learn: 0.0490258	total: 25.8s	remaining: 285ms
272:	learn: 0.0489141	total: 25.9s	remaining: 190ms
273:	learn: 0.0488381	total: 26s	remaining: 94.8ms
274:	learn: 0.0487186	total: 26s	remaining: 0us
[I 2024-12-19 14:15:21,101] Trial 6 finished with value: 65.7095697060385 and parameters: {'learning_rate': 0.010206730170698685, 'max_depth': 4, 'n_estimators': 275, 'scale_pos_weight': 6.78843201732805}. Best is trial 4 with value: 76.71888513333191.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.04
 - Recall_Train: 99.56
 - AUPRC_Train: 99.74
 - Accuracy_Train: 96.62
 - F1-Score_Train: 96.72
 - Precision_Test: 2.32
 - Recall_Test: 90.48
 - AUPRC_Test: 68.34
 - Accuracy_Test: 93.59
 - F1-Score_Test: 4.53
 - max_depth: 4
 - n_estimators: 275
 - learning_rate: 0.01
 - scale_pos_weight: 6.79
 - [... remaining CatBoost parameters unset (None), omitted ...]
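A note on the `Sobreajuste: 1` flag in these summaries: with a train precision of 94.04% against a test precision of 2.32%, the model clearly memorizes the SMOTE-balanced training set. A flag like this can be derived from the train/test metric gap; the sketch below is a hypothetical reconstruction (the function name and the 10-point threshold are assumptions, not taken from the notebook).

```python
# Hypothetical reconstruction of an overfitting flag from the train/test gap.
# The 10-point threshold is an illustrative assumption.
def overfit_flag(metric_train: float, metric_test: float, threshold: float = 10.0) -> int:
    """Return 1 when the train metric exceeds the test metric by more than `threshold`."""
    return int(metric_train - metric_test > threshold)

print(overfit_flag(99.74, 68.34))  # AUPRC train vs. test from the summary above -> 1
```

The gap matters more than any single train metric: near-perfect train AUPRC with a much lower test AUPRC is the signature of overfitting to the oversampled data.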
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 65.7096
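The cross-validated AUPRC reported above (area under the precision-recall curve, here on a 0–100 scale) can be computed with scikit-learn's `average_precision_score`; a minimal sketch with illustrative labels and scores, not taken from the notebook's data:

```python
# Minimal sketch of how an AUPRC figure like the one above is computed.
import numpy as np
from sklearn.metrics import average_precision_score

y_true = np.array([1, 0, 1, 0, 0])             # 1 = fraud (illustrative)
y_score = np.array([0.9, 0.8, 0.7, 0.2, 0.1])  # predicted fraud probability

auprc = 100 * average_precision_score(y_true, y_score)
print(f"AUPRC: {auprc:.4f}")  # -> AUPRC: 83.3333
```

Unlike accuracy, this metric is largely insensitive to the overwhelming majority of legitimate transactions, which is why it is used as the optimization target for this imbalanced problem.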

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6607241	total: 81.8ms	remaining: 9.08s
[... iterations 1–110 omitted ...]
111:	learn: 0.0412340	total: 12.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 92.08
 - Recall_Train: 99.99
 - AUPRC_Train: 99.80
 - Accuracy_Train: 95.70
 - F1-Score_Train: 95.87
 - Precision_Test: 1.75
 - Recall_Test: 91.27
 - AUPRC_Test: 68.01
 - Accuracy_Test: 91.35
 - F1-Score_Test: 3.43
 - max_depth: 5
 - n_estimators: 112
 - learning_rate: 0.01
 - scale_pos_weight: 12.44
 - [... remaining CatBoost parameters unset (None), omitted ...]
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6589820	total: 94.7ms	remaining: 10.5s
[... iterations 1–110 omitted ...]
111:	learn: 0.0608474	total: 12s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 87.79
 - Recall_Train: 100.00
 - AUPRC_Train: 99.61
 - Accuracy_Train: 93.04
 - F1-Score_Train: 93.50
 - Precision_Test: 1.17
 - Recall_Test: 97.62
 - AUPRC_Test: 60.47
 - Accuracy_Test: 86.13
 - F1-Score_Test: 2.31
 - max_depth: 5
 - n_estimators: 112
 - learning_rate: 0.01
 - scale_pos_weight: 12.44
 - [... remaining CatBoost parameters unset (None), omitted ...]
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6605847	total: 74.9ms	remaining: 8.31s
[... iterations 1–110 omitted ...]
111:	learn: 0.0497708	total: 10.5s	remaining: 0us
[I 2024-12-19 14:16:02,525] Trial 7 finished with value: 64.92665531792693 and parameters: {'learning_rate': 0.012295431777560331, 'max_depth': 5, 'n_estimators': 112, 'scale_pos_weight': 12.440833233561072}. Best is trial 4 with value: 76.71888513333191.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 90.72
 - Recall_Train: 99.89
 - AUPRC_Train: 99.65
 - Accuracy_Train: 94.84
 - F1-Score_Train: 95.09
 - Precision_Test: 1.47
 - Recall_Test: 91.27
 - AUPRC_Test: 66.30
 - Accuracy_Test: 89.72
 - F1-Score_Test: 2.90
 - max_depth: 5
 - n_estimators: 112
 - learning_rate: 0.01
 - scale_pos_weight: 12.44
 - (resto de parámetros de CatBoost sin fijar: None; omitidos por brevedad)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 64.9267
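The reported value is the AUPRC averaged across cross-validation folds. A minimal sketch of that computation, under stated assumptions, is below — `LogisticRegression` is used as a lightweight stand-in for CatBoost, and `make_classification` stands in for the real data.

```python
# Hedged sketch: mean AUPRC over stratified CV folds, as in the
# "Promedio de AUPRC en validación cruzada" line above.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

X, y = make_classification(n_samples=2000, weights=[0.98], random_state=0)

scores = []
for tr, te in StratifiedKFold(n_splits=3, shuffle=True, random_state=0).split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[tr], y[tr])
    # Average precision on the held-out fold, scaled to 0–100.
    scores.append(100 * average_precision_score(y[te], model.predict_proba(X[te])[:, 1]))

print(f"Promedio de AUPRC: {np.mean(scores):.4f}")
```

`StratifiedKFold` keeps the fraud ratio constant in every fold, which matters when the positive class is this rare.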

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6328904	total: 78.9ms	remaining: 11.6s
[... iteraciones 1–146 del entrenamiento de CatBoost omitidas por brevedad (learn: 0.5820174 → 0.0220132) ...]
147:	learn: 0.0219104	total: 14.8s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 96.35
 - Recall_Train: 100.00
 - AUPRC_Train: 99.93
 - Accuracy_Train: 98.11
 - F1-Score_Train: 98.14
 - Precision_Test: 3.79
 - Recall_Test: 91.27
 - AUPRC_Test: 71.31
 - Accuracy_Test: 96.08
 - F1-Score_Test: 7.27
 - max_depth: 5
 - n_estimators: 148
 - learning_rate: 0.02
 - scale_pos_weight: 8.01
 - (resto de parámetros de CatBoost sin fijar: None; omitidos por brevedad)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6390756	total: 74.2ms	remaining: 10.9s
[... iteraciones 1–146 del entrenamiento de CatBoost omitidas por brevedad (learn: 0.5802155 → 0.0348249) ...]
147:	learn: 0.0346445	total: 15s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.91
 - Recall_Train: 100.00
 - AUPRC_Train: 99.86
 - Accuracy_Train: 97.32
 - F1-Score_Train: 97.39
 - Precision_Test: 2.93
 - Recall_Test: 95.24
 - AUPRC_Test: 63.39
 - Accuracy_Test: 94.68
 - F1-Score_Test: 5.68
 - max_depth: 5
 - n_estimators: 148
 - learning_rate: 0.02
 - scale_pos_weight: 8.01
 - (resto de parámetros de CatBoost sin fijar: None; omitidos por brevedad)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6362143	total: 74.8ms	remaining: 11s
[... iteraciones 1–146 del entrenamiento de CatBoost omitidas por brevedad (learn: 0.5783390 → 0.0287301) ...]
147:	learn: 0.0285876	total: 15s	remaining: 0us
[I 2024-12-19 14:16:54,410] Trial 8 finished with value: 67.96113894002374 and parameters: {'learning_rate': 0.02247052435322354, 'max_depth': 5, 'n_estimators': 148, 'scale_pos_weight': 8.009318253271207}. Best is trial 4 with value: 76.71888513333191.
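Optuna reports each finished trial on a single console line like the one above. If you later want those values in a table, a small parser is enough; the helper below is a hypothetical utility (not part of the notebook) that recovers the trial number, the objective value (the mean CV AUPRC) and the sampled parameters from that exact line:

```python
import ast
import re

# The Optuna console line logged above, verbatim.
LOG_LINE = ("[I 2024-12-19 14:16:54,410] Trial 8 finished with value: "
            "67.96113894002374 and parameters: {'learning_rate': 0.02247052435322354, "
            "'max_depth': 5, 'n_estimators': 148, 'scale_pos_weight': 8.009318253271207}. "
            "Best is trial 4 with value: 76.71888513333191.")

def parse_optuna_line(line):
    """Extract (trial number, objective value, sampled params) from one log line."""
    m = re.search(r"Trial (\d+) finished with value: ([\d.]+) and parameters: (\{.*?\})", line)
    return int(m.group(1)), float(m.group(2)), ast.literal_eval(m.group(3))

trial, value, params = parse_optuna_line(LOG_LINE)
print(trial, round(value, 2), params["max_depth"])  # 8 67.96 5
```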
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 95.50
 - Recall_Train: 100.00
 - AUPRC_Train: 99.86
 - Accuracy_Train: 97.65
 - F1-Score_Train: 97.70
 - Precision_Test: 3.10
 - Recall_Test: 90.48
 - AUPRC_Test: 69.19
 - Accuracy_Test: 95.23
 - F1-Score_Test: 6.00
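The gulf between Recall_Test (90.48) and Precision_Test (3.10) above is not a bug: with a fraud prevalence p of roughly 0.168%, even a modest false-positive rate f swamps the true positives, since precision = r·p / (r·p + f·(1−p)) for recall r. A quick sanity check with the values reported above (the FPR is recovered from the reported accuracy, acc = (1−p)(1−f) + p·r):

```python
# Values reported in the run above, as fractions rather than percentages.
p = 0.00168211   # fraud prevalence in the fold (see the SMOTE proportions log)
r = 0.9048       # Recall_Test
acc = 0.9523     # Accuracy_Test

# accuracy = (1 - p) * (1 - f) + p * r  ->  solve for the false-positive rate f
f = 1 - (acc - p * r) / (1 - p)

# precision = TP / (TP + FP), expressed through rates and prevalence
precision = (r * p) / (r * p + f * (1 - p))

print(f"FPR ≈ {f:.4f}, precision ≈ {precision * 100:.2f}%")  # FPR ≈ 0.0476, precision ≈ 3.10%
```

So a false-positive rate of only ~4.8% is enough to drag precision down to 3.10% at this prevalence, which is exactly what the logged metrics show.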
 - max_depth: 5
 - n_estimators: 148
 - learning_rate: 0.02
 - scale_pos_weight: 8.01
 - [remaining CatBoost parameters reported as None (library defaults) omitted]
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 67.9611

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
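The fold log prints the class proportions before and after balancing. The sketch below reproduces that bookkeeping with the standard library only, using plain duplication of the minority class as a stand-in for SMOTE (real SMOTE, e.g. `imblearn.over_sampling.SMOTE`, generates *synthetic* interpolated samples rather than copies; the fold size here is a made-up toy number chosen to match the ~0.168% prevalence in the log):

```python
from collections import Counter

def proportions(labels):
    """Normalized class frequencies, like pandas value_counts(normalize=True)."""
    counts = Counter(labels)
    n = len(labels)
    return {cls: c / n for cls, c in sorted(counts.items())}

# Toy training fold with the same extreme imbalance as the log (~0.17% fraud).
y_train = [0] * 59370 + [1] * 100
print(proportions(y_train))  # {0: ~0.99832, 1: ~0.00168}

# Stand-in for SMOTE: add minority samples until both classes have equal counts.
deficit = y_train.count(0) - y_train.count(1)
y_balanced = y_train + [1] * deficit
print(proportions(y_balanced))  # {0: 0.5, 1: 0.5}
```

The key detail the fold logs make visible is that this balancing is applied *inside* each CV fold, to the training split only, so the test split keeps its original 0.168% prevalence.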

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[CatBoost training log condensed: 234 iterations, learn loss 0.6653 → 0.0310, ~24s total]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 96.01
 - Recall_Train: 99.99
 - AUPRC_Train: 99.90
 - Accuracy_Train: 97.92
 - F1-Score_Train: 97.96
 - Precision_Test: 3.43
 - Recall_Test: 91.27
 - AUPRC_Test: 69.34
 - Accuracy_Test: 95.67
 - F1-Score_Test: 6.62
 - max_depth: 5
 - n_estimators: 234
 - learning_rate: 0.01
 - scale_pos_weight: 6.49
 - [remaining CatBoost parameters reported as None (library defaults) omitted]
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[CatBoost training log condensed: 234 iterations, learn loss 0.6673 → 0.0470, ~22s total]

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 94.69
 - Recall_Train: 100.00
 - AUPRC_Train: 99.82
 - Accuracy_Train: 97.19
 - F1-Score_Train: 97.27
 - Precision_Test: 2.83
 - Recall_Test: 96.03
 - AUPRC_Test: 60.64
 - Accuracy_Test: 94.44
 - F1-Score_Test: 5.49
 - Tuned hyperparameters (all other CatBoost parameters were None, i.e. left at their defaults):
   - max_depth: 5
   - n_estimators: 234
   - learning_rate: 0.01
   - scale_pos_weight: 6.49
✅ Updated DataFrame size: (2, 133)
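The summary above reports precision, recall, AUPRC, accuracy, and F1 as percentages for each split. A minimal sketch of how such figures can be computed with scikit-learn, using illustrative (hypothetical) labels and scores; `average_precision_score` is assumed here as the AUPRC estimator:

```python
import numpy as np
from sklearn.metrics import (accuracy_score, average_precision_score,
                             f1_score, precision_score, recall_score)

# Illustrative labels and scores for a highly imbalanced split (hypothetical data).
rng = np.random.RandomState(0)
y_true = np.array([0] * 95 + [1] * 5)
y_score = np.concatenate([rng.uniform(0.0, 0.6, 95), rng.uniform(0.4, 1.0, 5)])
y_pred = (y_score >= 0.5).astype(int)

metrics = {
    "Precision_Test": 100 * precision_score(y_true, y_pred),
    "Recall_Test": 100 * recall_score(y_true, y_pred),
    "AUPRC_Test": 100 * average_precision_score(y_true, y_score),  # area under the PR curve
    "Accuracy_Test": 100 * accuracy_score(y_true, y_pred),
    "F1-Score_Test": 100 * f1_score(y_true, y_pred),
}
for name, value in metrics.items():
    print(f" - {name}: {value:.2f}")
```

AUPRC is computed from the continuous scores, while the other four metrics depend on the 0.5 decision threshold, which is why a model can show high AUPRC yet poor precision at the default cutoff.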

🔄 Fold 3: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
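The proportions printed above are normalized value counts of the fold's target before and after rebalancing to 50/50. A minimal sketch of that bookkeeping on hypothetical data, using naive duplication oversampling as a stand-in for SMOTE (real SMOTE synthesizes new minority points by interpolating between nearest neighbours rather than duplicating rows):

```python
import numpy as np
import pandas as pd

# Hypothetical imbalanced target for one training fold.
y = pd.Series([0] * 997 + [1] * 3, name="Class")
print("📊 Before resampling:")
print(y.value_counts(normalize=True))

# Naive duplication stand-in for SMOTE: repeat minority rows until classes match.
rng = np.random.default_rng(42)
extra = rng.choice(y.index[y == 1], size=(y == 0).sum() - (y == 1).sum())
y_res = pd.concat([y, y.loc[extra]], ignore_index=True)
print("📈 After resampling:")
print(y_res.value_counts(normalize=True))
```

Note that resampling is applied only to the training fold; the validation fold keeps its original class proportions so the reported metrics reflect the real imbalance.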

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.6657628	total: 154ms	remaining: 35.8s
...
233:	learn: 0.0394214	total: 24.1s	remaining: 0us
[I 2024-12-19 14:18:12,511] Trial 9 finished with value: 65.83384449024851 and parameters: {'learning_rate': 0.011217151265462048, 'max_depth': 5, 'n_estimators': 234, 'scale_pos_weight': 6.4905218679657}. Best is trial 4 with value: 76.71888513333191.
✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 95.29
 - Recall_Train: 99.82
 - AUPRC_Train: 99.82
 - Accuracy_Train: 97.44
 - F1-Score_Train: 97.50
 - Precision_Test: 2.93
 - Recall_Test: 90.48
 - AUPRC_Test: 67.52
 - Accuracy_Test: 94.94
 - F1-Score_Test: 5.68
 - Tuned hyperparameters (all other CatBoost parameters were None, i.e. left at their defaults):
   - max_depth: 5
   - n_estimators: 234
   - learning_rate: 0.01
   - scale_pos_weight: 6.49
✅ Updated DataFrame size: (3, 133)

🏆 Mean AUPRC in cross-validation: 65.8338
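The figure above is the per-fold AUPRC averaged across the cross-validation splits. A minimal sketch of that computation on a hypothetical imbalanced dataset, with `LogisticRegression` as a stand-in for the notebook's CatBoost model:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# Hypothetical imbalanced dataset; LogisticRegression stands in for CatBoost.
X, y = make_classification(n_samples=600, weights=[0.95], random_state=0)

scores = []
for train_idx, test_idx in StratifiedKFold(n_splits=3, shuffle=True,
                                           random_state=0).split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    proba = model.predict_proba(X[test_idx])[:, 1]
    scores.append(100 * average_precision_score(y[test_idx], proba))  # per-fold AUPRC (%)

mean_auprc = np.mean(scores)
print(f"🏆 Mean AUPRC across {len(scores)} folds: {mean_auprc:.4f}")
```

`StratifiedKFold` keeps the fraud rate roughly constant across folds, which matters when the positive class is this rare.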

🔍 Optimizing hyperparameters for CatBoost with Optuna...

🔄 Fold 1: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4931447	total: 64.6ms	remaining: 11s
...
170:	learn: 0.0090010	total: 14s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 97.80
 - Recall_Train: 100.00
 - AUPRC_Train: 99.93
 - Accuracy_Train: 98.87
 - F1-Score_Train: 98.89
 - Precision_Test: 6.04
 - Recall_Test: 89.68
 - AUPRC_Test: 64.90
 - Accuracy_Test: 97.64
 - F1-Score_Test: 11.32
 - Tuned hyperparameters (all other CatBoost parameters were None, i.e. left at their defaults):
   - max_depth: 3
   - n_estimators: 171
   - learning_rate: 0.09
   - scale_pos_weight: 10.85
✅ Updated DataFrame size: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
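The rebalancing shown above (roughly 0.17 % fraud before, 50/50 after) is normally done with `imblearn`'s `SMOTE`. As a dependency-free illustration of the core idea — each synthetic minority sample is interpolated between a real minority point and one of its k nearest minority neighbours — here is a minimal NumPy sketch; the names (`smote_oversample`, `minority`, `k`) are illustrative, not from the notebook.

```python
import numpy as np

def smote_oversample(X, y, minority=1, k=3, seed=0):
    """Minimal SMOTE sketch: oversample the minority class to a 50/50 split.

    Each synthetic point interpolates between a random minority sample and
    one of its k nearest minority neighbours (k must be < n_minority).
    """
    rng = np.random.default_rng(seed)
    X_min = X[y == minority]
    n_needed = (y != minority).sum() - len(X_min)     # samples to reach balance
    # pairwise distances within the minority class; exclude self-matches
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    neighbours = np.argsort(d, axis=1)[:, :k]          # k nearest per sample
    base = rng.integers(0, len(X_min), n_needed)
    nb = neighbours[base, rng.integers(0, k, n_needed)]
    gap = rng.random((n_needed, 1))                    # interpolation factor in [0, 1)
    X_syn = X_min[base] + gap * (X_min[nb] - X_min[base])
    X_bal = np.vstack([X, X_syn])
    y_bal = np.concatenate([y, np.full(n_needed, minority)])
    return X_bal, y_bal

# Tiny demo: 97 majority vs 3 minority samples, rebalanced to 97/97
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = np.array([0] * 97 + [1] * 3)
Xb, yb = smote_oversample(X, y, k=2)   # k=2: only 3 minority points exist
print(np.bincount(yb))                 # → [97 97]
```

As in the notebook, the resampling must be applied only inside each training fold — never to the test fold — so the reported test metrics reflect the true class imbalance.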

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4875881	total: 64ms	remaining: 10.9s
[... iterations 1-169 omitted: learn loss fell steadily from 0.3627 to 0.0147 ...]
170:	learn: 0.0146430	total: 13.7s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 96.75
 - Recall_Train: 100.00
 - AUPRC_Train: 99.94
 - Accuracy_Train: 98.32
 - F1-Score_Train: 98.35
 - Precision_Test: 4.41
 - Recall_Test: 92.86
 - AUPRC_Test: 70.98
 - Accuracy_Test: 96.61
 - F1-Score_Test: 8.43
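The fold results above show the typical SMOTE pattern on this dataset: near-perfect recall with very low test precision (many false alarms), and a large train/test AUPRC gap that triggers the overfitting flag. The threshold-based metrics can be computed directly from hard predictions; a minimal NumPy sketch (the function name `fraud_metrics` is illustrative — AUPRC itself additionally needs predicted probabilities, e.g. via scikit-learn's `average_precision_score`):

```python
import numpy as np

def fraud_metrics(y_true, y_pred):
    """Precision, recall, F1 and accuracy (in %) from hard 0/1 predictions."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = ((y_true == 1) & (y_pred == 1)).sum()
    fp = ((y_true == 0) & (y_pred == 1)).sum()
    fn = ((y_true == 1) & (y_pred == 0)).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = (y_true == y_pred).mean()
    return {k: round(v * 100, 2) for k, v in
            dict(precision=precision, recall=recall, f1=f1, accuracy=accuracy).items()}

# Toy example mirroring the pattern above: every fraud caught, many false alarms
y_true = np.array([0] * 95 + [1] * 5)
y_pred = np.array([0] * 60 + [1] * 35 + [1] * 5)
print(fraud_metrics(y_true, y_pred))
# → {'precision': 12.5, 'recall': 100.0, 'f1': 22.22, 'accuracy': 65.0}
```

Note how accuracy stays high even with poor precision — exactly why AUPRC and recall, not accuracy, drive model selection in this project.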
 - max_depth: 3
 - n_estimators: 171
 - learning_rate: 0.09
 - scale_pos_weight: 10.85
 [... remaining CatBoost parameters are None and are omitted here ...]
✅ DataFrame size updated: (2, 133)

🔄 Fold 3: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4908426	total: 63.3ms	remaining: 10.8s
[... iterations 1-169 omitted: learn loss fell steadily from 0.3821 to 0.0116 ...]
170:	learn: 0.0115463	total: 13.8s	remaining: 0us
[I 2024-12-19 14:19:00,347] Trial 10 finished with value: 69.21627930577772 and parameters: {'learning_rate': 0.09459749516165813, 'max_depth': 3, 'n_estimators': 171, 'scale_pos_weight': 10.84733886433144}. Best is trial 4 with value: 76.71888513333191.
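The `[I ...] Trial 10 finished` line is Optuna's log: each trial samples `learning_rate`, `max_depth`, `n_estimators` and `scale_pos_weight`, evaluates the model, and the study keeps the best trial. As a self-contained stand-in (no Optuna dependency, and `evaluate` is a toy surrogate for the real CatBoost + SMOTE cross-validation objective), the search loop looks roughly like:

```python
import random

def sample_params(rng):
    """Hypothetical search ranges matching the parameters in the trial log."""
    return {
        "learning_rate": rng.uniform(0.01, 0.3),
        "max_depth": rng.randint(3, 10),
        "n_estimators": rng.randint(100, 500),
        "scale_pos_weight": rng.uniform(1.0, 20.0),
    }

def evaluate(params):
    """Stand-in objective: in the notebook this trains CatBoost on
    SMOTE-balanced folds and returns the mean AUPRC (%)."""
    # toy surrogate so the sketch runs end to end
    return 80.0 - abs(params["learning_rate"] - 0.1) * 100 - abs(params["max_depth"] - 4)

rng = random.Random(42)
trials = [(evaluate(p), p) for p in (sample_params(rng) for _ in range(20))]
best_value, best_params = max(trials, key=lambda t: t[0])
print(f"Best is trial with value: {best_value:.2f}")
```

Optuna's TPE sampler replaces the uniform `sample_params` with a model-guided one, which is why its later trials (like trial 10 here) cluster around promising regions such as `learning_rate ≈ 0.09`.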
✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 97.29
 - Recall_Train: 100.00
 - AUPRC_Train: 99.94
 - Accuracy_Train: 98.61
 - F1-Score_Train: 98.62
 - Precision_Test: 5.10
 - Recall_Test: 90.48
 - AUPRC_Test: 71.77
 - Accuracy_Test: 97.15
 - F1-Score_Test: 9.66
 - max_depth: 3
 - n_estimators: 171
 - learning_rate: 0.09
 - scale_pos_weight: 10.85
 [... remaining CatBoost parameters are None and are omitted here ...]
✅ DataFrame size updated: (3, 133)

🏆 Mean AUPRC in cross-validation: 69.2163
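The cross-validated figure above is simply the mean of the per-fold AUPRC scores. A minimal sketch of the fold loop follows; the stratified split is implemented by hand with NumPy, and `train_and_score` is a hypothetical placeholder for the notebook's CatBoost + SMOTE pipeline:

```python
import numpy as np

def stratified_folds(y, n_splits=3, seed=0):
    """Yield (train_idx, test_idx) pairs preserving the class ratio per fold."""
    rng = np.random.default_rng(seed)
    folds = [[] for _ in range(n_splits)]
    for cls in np.unique(y):
        idx = rng.permutation(np.where(y == cls)[0])
        for i, chunk in enumerate(np.array_split(idx, n_splits)):
            folds[i].extend(chunk)
    for i in range(n_splits):
        test = np.array(folds[i])
        train = np.array([j for f in folds[:i] + folds[i + 1:] for j in f])
        yield train, test

def train_and_score(train_idx, test_idx):
    """Hypothetical stand-in for: SMOTE-balance the train fold, fit CatBoost,
    return the AUPRC (%) on the untouched test fold."""
    return 70.0  # placeholder score

# Toy labels with a 1% positive class, mirroring the fraud imbalance
y = np.array([0] * 594 + [1] * 6)
scores = [train_and_score(tr, te) for tr, te in stratified_folds(y)]
print(f"Mean AUPRC in cross-validation: {np.mean(scores):.4f}")
```

Stratification matters here: with only ~0.17 % fraud cases, an unstratified split could easily leave a fold with no positives at all, making AUPRC undefined for that fold.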

🔍 Optimizing hyperparameters for CatBoost with Optuna...

🔄 Fold 1: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.5473609	total: 91.4ms	remaining: 16.2s
[... iterations 1-172 omitted: learn loss fell steadily from 0.4021 to 0.0031 ...]
173:	learn: 0.0030678	total: 23.8s	remaining: 547ms
174:	learn: 0.0030422	total: 23.9s	remaining: 411ms
175:	learn: 0.0030232	total: 24s	remaining: 273ms
176:	learn: 0.0029867	total: 24.1s	remaining: 136ms
177:	learn: 0.0029515	total: 24.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.14
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.57
 - F1-Score_Train: 99.57
 - Precision_Test: 12.94
 - Recall_Test: 87.30
 - AUPRC_Test: 78.40
 - Accuracy_Test: 98.99
 - F1-Score_Test: 22.54
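The gap between the train and test scores above is a direct consequence of class imbalance, and the test figures can be reproduced from confusion-matrix counts. A minimal sketch, using illustrative (assumed) counts chosen to match the ratios reported for this fold:

```python
# Sketch: how precision, recall, F1 and FNR behave under heavy class imbalance.
# The confusion-matrix counts below are illustrative assumptions, not the
# actual counts from this CatBoost run.
def classification_metrics(tp, fp, fn, tn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    fnr = fn / (fn + tp)  # false-negative rate: frauds the model misses
    return precision, recall, f1, fnr

# e.g. 110 frauds caught, 740 false alarms, 16 frauds missed, ~56k legit OK
p, r, f1, fnr = classification_metrics(tp=110, fp=740, fn=16, tn=56000)
print(round(p * 100, 2), round(r * 100, 2), round(f1 * 100, 2))  # 12.94 87.3 22.54
```

Even with nearly 9 of every 10 frauds caught, the few hundred false alarms swamp precision, which is why F1 collapses while accuracy stays near 99%.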
 - Hyperparameters tuned by Optuna: max_depth: 6, n_estimators: 178, learning_rate: 0.06, scale_pos_weight: 14.93
 - (all remaining CatBoost parameters printed as None, i.e. library defaults; full dump omitted)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
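The 50/50 proportions after resampling come from SMOTE, which synthesizes minority-class points by interpolating between existing minority samples. A pure-Python sketch of that core idea (a stand-in for `imblearn.over_sampling.SMOTE`, 1-D and without the k-nearest-neighbour step, for illustration only):

```python
import random

def smote_sketch(majority, minority, seed=42):
    """Naive SMOTE sketch: oversample the minority class to parity by
    interpolating between random pairs of minority points (1-D here;
    real SMOTE interpolates toward k-nearest neighbours in feature space)."""
    rng = random.Random(seed)
    synthetic = list(minority)
    while len(synthetic) < len(majority):
        a, b = rng.sample(minority, 2)
        lam = rng.random()                   # interpolation factor in [0, 1)
        synthetic.append(a + lam * (b - a))  # point on the segment a..b
    return synthetic

majority = [0.0] * 1000            # stand-in for class 0 (legitimate)
minority = [4.8, 5.1, 5.5, 6.0]    # stand-in for class 1 (fraud)
balanced_minority = smote_sketch(majority, minority)
print(len(balanced_minority) / (len(majority) + len(balanced_minority)))  # 0.5
```

Note that, as in this notebook, resampling is applied inside each fold to the training split only; the test split keeps the original 0.17% fraud rate.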

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5402949	total: 87.1ms	remaining: 15.4s
1:	learn: 0.4244271	total: 175ms	remaining: 15.4s
2:	learn: 0.3413336	total: 265ms	remaining: 15.4s
    [... iterations 3-176 omitted; learn loss decreases steadily from 0.2679 to 0.0046 ...]
177:	learn: 0.0045318	total: 18.8s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.77
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.38
 - F1-Score_Train: 99.38
 - Precision_Test: 10.41
 - Recall_Test: 89.68
 - AUPRC_Test: 68.85
 - Accuracy_Test: 98.69
 - F1-Score_Test: 18.66
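The `Sobreajuste: 1` flag above marks the large train/test gap; the exact rule the notebook applies is not visible in this output. A plausible sketch, assuming a fixed gap threshold of 10 AUPRC points:

```python
def overfit_flag(auprc_train, auprc_test, gap_threshold=10.0):
    """Return 1 when the train/test AUPRC gap exceeds the threshold.
    The 10-point threshold is an assumption for illustration; the
    notebook's actual rule is not shown in this output."""
    return int(auprc_train - auprc_test > gap_threshold)

print(overfit_flag(99.97, 68.85))  # fold-2 figures from the log above -> 1
```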
 - Hyperparameters tuned by Optuna: max_depth: 6, n_estimators: 178, learning_rate: 0.06, scale_pos_weight: 14.93
 - (all remaining CatBoost parameters printed as None, i.e. library defaults; full dump omitted)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5459035	total: 82.1ms	remaining: 14.5s
1:	learn: 0.4004351	total: 177ms	remaining: 15.6s
2:	learn: 0.3128886	total: 273ms	remaining: 15.9s
    [... iterations 3-176 omitted; learn loss decreases steadily from 0.2517 to 0.0036 ...]
177:	learn: 0.0035930	total: 20.5s	remaining: 0us
[I 2024-12-19 14:20:10,939] Trial 11 finished with value: 73.84096687604249 and parameters: {'learning_rate': 0.0564778558554291, 'max_depth': 6, 'n_estimators': 178, 'scale_pos_weight': 14.93460217793416}. Best is trial 4 with value: 76.71888513333191.
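Each Optuna trial returns the mean cross-validated AUPRC, and the study keeps the running best (here trial 4 beats trial 11). A stand-in sketch of that bookkeeping, using the two values visible in the log line above (not the Optuna API itself):

```python
# Stand-in for Optuna's maximize-direction study: keep the best (trial, value).
# The two values below are the ones printed in the trial log above.
trials = {4: 76.71888513333191, 11: 73.84096687604249}
best_trial = max(trials, key=trials.get)
print(best_trial, round(trials[best_trial], 2))  # 4 76.72
```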
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.91
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.45
 - F1-Score_Train: 99.45
 - Precision_Test: 11.51
 - Recall_Test: 88.89
 - AUPRC_Test: 74.27
 - Accuracy_Test: 98.83
 - F1-Score_Test: 20.38
 - Hyperparameters tuned by Optuna: max_depth: 6, n_estimators: 178, learning_rate: 0.06, scale_pos_weight: 14.93
 - (all remaining CatBoost parameters printed as None, i.e. library defaults; full dump omitted)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 73.8410
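The reported cross-validation average is simply the mean of the three fold AUPRCs printed above (78.40, 68.85 and 74.27):

```python
from statistics import mean

# Test AUPRC per fold, as reported above for this Optuna trial
fold_auprc = [78.40, 68.85, 74.27]
print(round(mean(fold_auprc), 4))  # 73.84
```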

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5307292	total: 92.2ms	remaining: 14.7s
1:	learn: 0.3750641	total: 187ms	remaining: 14.9s
2:	learn: 0.2922774	total: 280ms	remaining: 14.7s
    [... iterations 3-48 omitted; learn loss decreases steadily from 0.2208 to 0.0131 ...]
49:	learn: 0.0127209	total: 4.74s	remaining: 10.5s
50:	learn: 0.0124794	total: 4.83s	remaining: 10.4s
51:	learn: 0.0121468	total: 4.92s	remaining: 10.3s
52:	learn: 0.0118727	total: 5.01s	remaining: 10.2s
53:	learn: 0.0115732	total: 5.11s	remaining: 10.1s
54:	learn: 0.0112421	total: 5.2s	remaining: 10s
55:	learn: 0.0110235	total: 5.29s	remaining: 9.91s
56:	learn: 0.0108592	total: 5.39s	remaining: 9.83s
57:	learn: 0.0106562	total: 5.48s	remaining: 9.74s
58:	learn: 0.0104715	total: 5.58s	remaining: 9.65s
59:	learn: 0.0102731	total: 5.68s	remaining: 9.56s
60:	learn: 0.0100393	total: 5.77s	remaining: 9.46s
61:	learn: 0.0098554	total: 5.86s	remaining: 9.35s
62:	learn: 0.0096552	total: 5.96s	remaining: 9.26s
63:	learn: 0.0094656	total: 6.04s	remaining: 9.15s
64:	learn: 0.0093105	total: 6.12s	remaining: 9.04s
65:	learn: 0.0091202	total: 6.23s	remaining: 8.97s
66:	learn: 0.0089263	total: 6.32s	remaining: 8.86s
67:	learn: 0.0087879	total: 6.4s	remaining: 8.76s
68:	learn: 0.0085378	total: 6.51s	remaining: 8.68s
69:	learn: 0.0083977	total: 6.61s	remaining: 8.59s
70:	learn: 0.0082394	total: 6.7s	remaining: 8.49s
71:	learn: 0.0080589	total: 6.8s	remaining: 8.4s
72:	learn: 0.0079789	total: 6.88s	remaining: 8.29s
73:	learn: 0.0077670	total: 6.98s	remaining: 8.2s
74:	learn: 0.0076285	total: 7.08s	remaining: 8.12s
75:	learn: 0.0074433	total: 7.17s	remaining: 8.02s
76:	learn: 0.0073395	total: 7.25s	remaining: 7.91s
77:	learn: 0.0072309	total: 7.35s	remaining: 7.82s
78:	learn: 0.0071029	total: 7.44s	remaining: 7.72s
79:	learn: 0.0070218	total: 7.57s	remaining: 7.66s
80:	learn: 0.0069498	total: 7.67s	remaining: 7.58s
81:	learn: 0.0068708	total: 7.76s	remaining: 7.47s
82:	learn: 0.0068109	total: 7.85s	remaining: 7.38s
83:	learn: 0.0066764	total: 7.94s	remaining: 7.28s
84:	learn: 0.0065324	total: 8.03s	remaining: 7.18s
85:	learn: 0.0064561	total: 8.13s	remaining: 7.09s
86:	learn: 0.0063476	total: 8.22s	remaining: 7s
87:	learn: 0.0062510	total: 8.3s	remaining: 6.89s
88:	learn: 0.0062180	total: 8.41s	remaining: 6.8s
89:	learn: 0.0061519	total: 8.49s	remaining: 6.7s
90:	learn: 0.0060723	total: 8.58s	remaining: 6.6s
91:	learn: 0.0060130	total: 8.7s	remaining: 6.52s
92:	learn: 0.0058978	total: 8.82s	remaining: 6.45s
93:	learn: 0.0058428	total: 8.98s	remaining: 6.4s
94:	learn: 0.0057468	total: 9.14s	remaining: 6.35s
95:	learn: 0.0056566	total: 9.31s	remaining: 6.31s
96:	learn: 0.0055207	total: 9.48s	remaining: 6.26s
97:	learn: 0.0054616	total: 9.66s	remaining: 6.21s
98:	learn: 0.0053814	total: 9.85s	remaining: 6.17s
99:	learn: 0.0053303	total: 10s	remaining: 6.12s
100:	learn: 0.0052622	total: 10.2s	remaining: 6.07s
101:	learn: 0.0052213	total: 10.4s	remaining: 5.99s
102:	learn: 0.0051982	total: 10.5s	remaining: 5.92s
103:	learn: 0.0051442	total: 10.7s	remaining: 5.86s
104:	learn: 0.0050764	total: 10.9s	remaining: 5.79s
105:	learn: 0.0050337	total: 11s	remaining: 5.72s
106:	learn: 0.0049493	total: 11.2s	remaining: 5.66s
107:	learn: 0.0048791	total: 11.4s	remaining: 5.59s
108:	learn: 0.0048598	total: 11.5s	remaining: 5.51s
109:	learn: 0.0048332	total: 11.7s	remaining: 5.43s
110:	learn: 0.0047154	total: 11.9s	remaining: 5.36s
111:	learn: 0.0046416	total: 12.1s	remaining: 5.28s
112:	learn: 0.0045445	total: 12.3s	remaining: 5.21s
113:	learn: 0.0044904	total: 12.4s	remaining: 5.12s
114:	learn: 0.0044565	total: 12.6s	remaining: 5.02s
115:	learn: 0.0043905	total: 12.7s	remaining: 4.94s
116:	learn: 0.0043528	total: 12.9s	remaining: 4.85s
117:	learn: 0.0043021	total: 13.1s	remaining: 4.76s
118:	learn: 0.0042658	total: 13.3s	remaining: 4.68s
119:	learn: 0.0042369	total: 13.4s	remaining: 4.59s
120:	learn: 0.0041955	total: 13.6s	remaining: 4.5s
121:	learn: 0.0041664	total: 13.8s	remaining: 4.41s
122:	learn: 0.0041337	total: 13.9s	remaining: 4.31s
123:	learn: 0.0040806	total: 14.1s	remaining: 4.21s
124:	learn: 0.0040368	total: 14.2s	remaining: 4.1s
125:	learn: 0.0039896	total: 14.3s	remaining: 3.98s
126:	learn: 0.0039275	total: 14.4s	remaining: 3.87s
127:	learn: 0.0038324	total: 14.5s	remaining: 3.75s
128:	learn: 0.0037973	total: 14.6s	remaining: 3.63s
129:	learn: 0.0037651	total: 14.7s	remaining: 3.51s
130:	learn: 0.0037001	total: 14.8s	remaining: 3.39s
131:	learn: 0.0036563	total: 14.9s	remaining: 3.27s
132:	learn: 0.0036362	total: 15s	remaining: 3.16s
133:	learn: 0.0035993	total: 15.1s	remaining: 3.04s
134:	learn: 0.0035780	total: 15.2s	remaining: 2.92s
135:	learn: 0.0035535	total: 15.3s	remaining: 2.81s
136:	learn: 0.0035293	total: 15.4s	remaining: 2.69s
137:	learn: 0.0034737	total: 15.5s	remaining: 2.58s
138:	learn: 0.0034737	total: 15.5s	remaining: 2.46s
139:	learn: 0.0034133	total: 15.6s	remaining: 2.34s
140:	learn: 0.0033674	total: 15.7s	remaining: 2.23s
141:	learn: 0.0033562	total: 15.8s	remaining: 2.12s
142:	learn: 0.0033385	total: 15.9s	remaining: 2s
143:	learn: 0.0032807	total: 16s	remaining: 1.89s
144:	learn: 0.0032257	total: 16.1s	remaining: 1.78s
145:	learn: 0.0032066	total: 16.2s	remaining: 1.66s
146:	learn: 0.0031671	total: 16.3s	remaining: 1.55s
147:	learn: 0.0031458	total: 16.4s	remaining: 1.44s
148:	learn: 0.0031134	total: 16.5s	remaining: 1.33s
149:	learn: 0.0031008	total: 16.6s	remaining: 1.22s
150:	learn: 0.0031008	total: 16.6s	remaining: 1.1s
151:	learn: 0.0030778	total: 16.7s	remaining: 990ms
152:	learn: 0.0030778	total: 16.8s	remaining: 879ms
153:	learn: 0.0030424	total: 16.9s	remaining: 769ms
154:	learn: 0.0030071	total: 17s	remaining: 658ms
155:	learn: 0.0029735	total: 17.1s	remaining: 548ms
156:	learn: 0.0029457	total: 17.2s	remaining: 438ms
157:	learn: 0.0029457	total: 17.3s	remaining: 328ms
158:	learn: 0.0029457	total: 17.3s	remaining: 218ms
159:	learn: 0.0029457	total: 17.4s	remaining: 109ms
160:	learn: 0.0029298	total: 17.5s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.13
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.56
 - F1-Score_Train: 99.56
 - Precision_Test: 13.02
 - Recall_Test: 86.51
 - AUPRC_Test: 75.99
 - Accuracy_Test: 99.01
 - F1-Score_Test: 22.64
 - max_depth: 6
 - n_estimators: 161
 - learning_rate: 0.06
 - scale_pos_weight: 14.99
 - [resto de hiperparámetros: None (valores por defecto de CatBoost)]
✅ Tamaño del DataFrame actualizado: (1, 133)
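The per-fold AUPRC figures reported above are average precision on a 0-100 scale. A minimal sketch of how such a value is computed, assuming scikit-learn's `average_precision_score` (the usual choice; the notebook's own metric helper is not visible in this output):

```python
from sklearn.metrics import average_precision_score

# Toy labels and model scores: 2 fraud cases among 6 transactions
y_true = [0, 0, 0, 0, 1, 1]
y_score = [0.10, 0.20, 0.30, 0.80, 0.40, 0.90]

# Average precision, reported on a 0-100 scale like the logs above
auprc = 100 * average_precision_score(y_true, y_score)
print(f"AUPRC: {auprc:.2f}")  # → 83.33
```

Unlike accuracy (which the logs show near 99% even when precision is poor), average precision summarises the precision-recall trade-off over all thresholds, which is why it is the headline metric for this heavily imbalanced dataset.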

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5231649	total: 85.4ms	remaining: 13.7s
1:	learn: 0.3992002	total: 176ms	remaining: 14s
[... iteraciones 2-159 omitidas ...]
160:	learn: 0.0041972	total: 17.3s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.85
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.42
 - F1-Score_Train: 99.42
 - Precision_Test: 10.89
 - Recall_Test: 89.68
 - AUPRC_Test: 71.62
 - Accuracy_Test: 98.75
 - F1-Score_Test: 19.42
 - max_depth: 6
 - n_estimators: 161
 - learning_rate: 0.06
 - scale_pos_weight: 14.99
 - [resto de hiperparámetros: None (valores por defecto de CatBoost)]
✅ Tamaño del DataFrame actualizado: (2, 133)
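Each results block carries a `Sobreajuste: 1` flag alongside a large train/test gap (e.g. AUPRC 99.98 on train versus 71.62 on test in fold 2). The notebook's actual rule is not visible in this output; a hypothetical reconstruction of such a flag, where the function name and the 10-point margin are assumptions:

```python
def overfit_flag(metric_train, metric_test, margin=10.0):
    """Return 1 when the train metric exceeds the test metric by more than
    `margin` points, 0 otherwise. Hypothetical reconstruction of the
    "Sobreajuste" flag; the notebook's real threshold is not shown."""
    return int(metric_train - metric_test > margin)

# Fold 2 AUPRC figures from the log above
flag = overfit_flag(99.98, 71.62)  # → 1
```

A gap of nearly 30 AUPRC points is consistent with SMOTE's known failure mode: the model fits the synthetically balanced training data almost perfectly but generalises much less well to the untouched, imbalanced test fold.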

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5290729	total: 80.1ms	remaining: 12.8s
1:	learn: 0.3772317	total: 171ms	remaining: 13.6s
[... iteraciones 2-159 omitidas ...]
160:	learn: 0.0036808	total: 19s	remaining: 0us
[I 2024-12-19 14:21:11,754] Trial 12 finished with value: 72.37429312947954 and parameters: {'learning_rate': 0.06363954218996214, 'max_depth': 6, 'n_estimators': 161, 'scale_pos_weight': 14.99247144578231}. Best is trial 4 with value: 76.71888513333191.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.86
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.43
 - F1-Score_Train: 99.43
 - Precision_Test: 10.67
 - Recall_Test: 88.10
 - AUPRC_Test: 69.51
 - Accuracy_Test: 98.74
 - F1-Score_Test: 19.04
 - max_depth: 6
 - n_estimators: 161
 - learning_rate: 0.06
 - scale_pos_weight: 14.99
 - (all other CatBoost parameters: None — library defaults)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 72.3743
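The per-fold AUPRC values averaged above are typically computed with scikit-learn's `average_precision_score`, a step-wise estimate of the area under the precision-recall curve. A minimal, self-contained illustration with toy labels and scores (not the notebook's data):

```python
# AUPRC (area under the precision-recall curve) as reported per fold;
# average_precision_score is the usual step-wise estimator of it.
from sklearn.metrics import average_precision_score

y_true = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]                       # 20% positives
y_score = [0.1, 0.2, 0.1, 0.3, 0.2, 0.1, 0.4, 0.2, 0.9, 0.8]  # model scores
auprc = 100 * average_precision_score(y_true, y_score)
print(f"{auprc:.2f}")  # 100.00 — both positives ranked above every negative
```

Unlike accuracy, this metric ignores true negatives, which is why it is the headline score for a dataset where fraud is ~0.17 % of transactions.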

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
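The class proportions printed above (≈0.17 % fraud before, 50/50 after) come from applying SMOTE to the training fold only. As a dependency-free sketch of SMOTE's core idea — synthesizing minority samples by interpolating toward a nearby minority-class neighbor — here is a small NumPy version. The notebook itself uses `imblearn.over_sampling.SMOTE`; `smote_like_oversample` is a hypothetical helper for illustration, not the library's API:

```python
import numpy as np

def smote_like_oversample(X_min, n_new, k=5, rng=None):
    """Create n_new synthetic minority samples by interpolating between a
    minority point and one of its k nearest minority neighbors (SMOTE's idea)."""
    rng = np.random.default_rng(rng)
    # pairwise distances within the minority class; a point is not its own neighbor
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]              # k nearest neighbors per point
    base = rng.integers(0, len(X_min), n_new)      # base point per synthetic sample
    neigh = nn[base, rng.integers(0, k, n_new)]    # one random neighbor each
    gap = rng.random((n_new, 1))                   # interpolation factor in [0, 1)
    return X_min[base] + gap * (X_min[neigh] - X_min[base])

# toy minority class: 10 points in 2-D, augmented with 40 synthetic points
X_min = np.random.default_rng(0).normal(size=(10, 2))
X_new = smote_like_oversample(X_min, n_new=40, rng=0)
print(X_new.shape)  # (40, 2)
```

Because every synthetic point is a convex combination of two real minority points, resampling must happen inside each fold, after the train/test split — otherwise synthetic neighbors of test points leak into training.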

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5311492	total: 92.8ms	remaining: 9.56s
[… per-iteration CatBoost training log trimmed (iterations 1–102) …]
103:	learn: 0.0053418	total: 10.6s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.67
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.33
 - F1-Score_Train: 99.33
 - Precision_Test: 9.54
 - Recall_Test: 88.10
 - AUPRC_Test: 76.39
 - Accuracy_Test: 98.58
 - F1-Score_Test: 17.22
 - max_depth: 6
 - n_estimators: 104
 - learning_rate: 0.07
 - scale_pos_weight: 11.72
 - (all other CatBoost parameters: None — library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5237014	total: 171ms	remaining: 17.6s
[… per-iteration CatBoost training log trimmed (iterations 1–102) …]
103:	learn: 0.0080339	total: 9.96s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.19
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.08
 - F1-Score_Train: 99.09
 - Precision_Test: 7.62
 - Recall_Test: 90.48
 - AUPRC_Test: 72.18
 - Accuracy_Test: 98.14
 - F1-Score_Test: 14.05
 - max_depth: 6
 - n_estimators: 104
 - learning_rate: 0.07
 - scale_pos_weight: 11.72
 - (all other CatBoost parameters: None — library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5308791	total: 163ms	remaining: 16.8s
[… per-iteration CatBoost training log trimmed (iterations 1–102) …]
103:	learn: 0.0067799	total: 11.3s	remaining: 0us
[I 2024-12-19 14:21:51,819] Trial 13 finished with value: 73.57994734889388 and parameters: {'learning_rate': 0.06611682054595477, 'max_depth': 6, 'n_estimators': 104, 'scale_pos_weight': 11.721329337016055}. Best is trial 4 with value: 76.71888513333191.
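Each Optuna trial in the log samples `learning_rate`, `max_depth`, `n_estimators` and `scale_pos_weight`, runs stratified cross-validation with per-fold oversampling, and scores the trial by its mean test AUPRC; the best trial is kept. The sketch below mirrors that loop structure only — random search stands in for Optuna's TPE sampler, scikit-learn's `GradientBoostingClassifier` stands in for CatBoost, and the oversampling is naive duplication rather than SMOTE — so it runs with no extra dependencies. It is not the notebook's actual code:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# synthetic imbalanced dataset standing in for the credit-card data
X, y = make_classification(n_samples=1000, weights=[0.95, 0.05], random_state=0)
rng = np.random.default_rng(0)
best = {"auprc": -1.0}

for trial in range(2):  # the real run uses many more trials
    params = {  # sample a candidate configuration (random-search analog of TPE)
        "learning_rate": rng.uniform(0.01, 0.1),
        "max_depth": int(rng.integers(3, 7)),
        "n_estimators": int(rng.integers(30, 80)),
    }
    scores = []
    for tr, te in StratifiedKFold(3, shuffle=True, random_state=0).split(X, y):
        # rebalance the TRAINING fold only, by duplicating minority indices
        # (the notebook uses SMOTE here instead)
        minority = tr[y[tr] == 1]
        boost = rng.choice(minority, size=len(tr) - 2 * len(minority), replace=True)
        tr_bal = np.concatenate([tr, boost])
        model = GradientBoostingClassifier(**params, random_state=0)
        model.fit(X[tr_bal], y[tr_bal])
        # score on the untouched test fold with AUPRC (average precision)
        scores.append(average_precision_score(y[te], model.predict_proba(X[te])[:, 1]))
    auprc = 100 * float(np.mean(scores))
    if auprc > best["auprc"]:
        best = {"auprc": auprc, **params}

print(f"Best mean AUPRC: {best['auprc']:.2f}")
```

This mirrors why the log prints one full CV run per trial: the value Optuna reports (e.g. `Trial 13 finished with value: 73.57…`) is exactly such a cross-validated mean AUPRC.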
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.37
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 99.17
 - F1-Score_Train: 99.18
 - Precision_Test: 8.18
 - Recall_Test: 88.89
 - AUPRC_Test: 72.18
 - Accuracy_Test: 98.30
 - F1-Score_Test: 14.97
 - max_depth: 6
 - n_estimators: 104
 - learning_rate: 0.07
 - scale_pos_weight: 11.72
 - (all other CatBoost parameters: None — library defaults)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 73.5799

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5740955	total: 154ms	remaining: 33.1s
[… per-iteration CatBoost training log trimmed (iterations 1–74) …]
75:	learn: 0.0147573	total: 9.54s	remaining: 17.6s
76:	learn: 0.0144862	total: 9.64s	remaining: 17.4s
77:	learn: 0.0142477	total: 9.73s	remaining: 17.2s
78:	learn: 0.0140129	total: 9.81s	remaining: 17s
79:	learn: 0.0137889	total: 9.91s	remaining: 16.9s
80:	learn: 0.0135610	total: 10s	remaining: 16.7s
81:	learn: 0.0133337	total: 10.1s	remaining: 16.5s
82:	learn: 0.0131675	total: 10.2s	remaining: 16.4s
83:	learn: 0.0130253	total: 10.3s	remaining: 16.2s
84:	learn: 0.0128629	total: 10.4s	remaining: 16s
85:	learn: 0.0127340	total: 10.5s	remaining: 15.9s
86:	learn: 0.0125656	total: 10.6s	remaining: 15.7s
87:	learn: 0.0124181	total: 10.7s	remaining: 15.5s
88:	learn: 0.0122889	total: 10.8s	remaining: 15.4s
89:	learn: 0.0121610	total: 10.9s	remaining: 15.2s
90:	learn: 0.0120043	total: 10.9s	remaining: 15s
91:	learn: 0.0118834	total: 11s	remaining: 14.9s
92:	learn: 0.0117549	total: 11.1s	remaining: 14.7s
93:	learn: 0.0116668	total: 11.2s	remaining: 14.6s
94:	learn: 0.0115247	total: 11.3s	remaining: 14.4s
95:	learn: 0.0113403	total: 11.4s	remaining: 14.3s
96:	learn: 0.0112087	total: 11.5s	remaining: 14.1s
97:	learn: 0.0110562	total: 11.6s	remaining: 14s
98:	learn: 0.0108901	total: 11.7s	remaining: 13.8s
99:	learn: 0.0107389	total: 11.8s	remaining: 13.7s
100:	learn: 0.0105917	total: 11.9s	remaining: 13.5s
101:	learn: 0.0104686	total: 12s	remaining: 13.4s
102:	learn: 0.0103792	total: 12.1s	remaining: 13.2s
103:	learn: 0.0102961	total: 12.1s	remaining: 13.1s
104:	learn: 0.0101274	total: 12.3s	remaining: 13s
105:	learn: 0.0100243	total: 12.3s	remaining: 12.8s
106:	learn: 0.0099138	total: 12.4s	remaining: 12.7s
107:	learn: 0.0097801	total: 12.5s	remaining: 12.5s
108:	learn: 0.0096778	total: 12.6s	remaining: 12.4s
109:	learn: 0.0095762	total: 12.7s	remaining: 12.3s
110:	learn: 0.0095187	total: 12.8s	remaining: 12.1s
111:	learn: 0.0094244	total: 12.9s	remaining: 12s
112:	learn: 0.0093360	total: 13s	remaining: 11.8s
113:	learn: 0.0092545	total: 13.1s	remaining: 11.7s
114:	learn: 0.0091625	total: 13.2s	remaining: 11.6s
115:	learn: 0.0090569	total: 13.3s	remaining: 11.4s
116:	learn: 0.0089188	total: 13.4s	remaining: 11.3s
117:	learn: 0.0088357	total: 13.5s	remaining: 11.2s
118:	learn: 0.0087414	total: 13.6s	remaining: 11.1s
119:	learn: 0.0086325	total: 13.7s	remaining: 10.9s
120:	learn: 0.0085016	total: 13.7s	remaining: 10.8s
121:	learn: 0.0084171	total: 13.8s	remaining: 10.7s
122:	learn: 0.0083699	total: 13.9s	remaining: 10.5s
123:	learn: 0.0082853	total: 14s	remaining: 10.4s
124:	learn: 0.0081766	total: 14.1s	remaining: 10.3s
125:	learn: 0.0081064	total: 14.2s	remaining: 10.1s
126:	learn: 0.0080116	total: 14.3s	remaining: 10s
127:	learn: 0.0079264	total: 14.4s	remaining: 9.91s
128:	learn: 0.0078013	total: 14.5s	remaining: 9.79s
129:	learn: 0.0077540	total: 14.6s	remaining: 9.65s
130:	learn: 0.0076670	total: 14.7s	remaining: 9.53s
131:	learn: 0.0075474	total: 14.8s	remaining: 9.41s
132:	learn: 0.0074675	total: 14.9s	remaining: 9.28s
133:	learn: 0.0073777	total: 15s	remaining: 9.16s
134:	learn: 0.0073211	total: 15.1s	remaining: 9.06s
135:	learn: 0.0072759	total: 15.3s	remaining: 8.99s
136:	learn: 0.0072156	total: 15.5s	remaining: 8.91s
137:	learn: 0.0071555	total: 15.6s	remaining: 8.84s
138:	learn: 0.0071087	total: 15.8s	remaining: 8.75s
139:	learn: 0.0070548	total: 16s	remaining: 8.67s
140:	learn: 0.0069210	total: 16.2s	remaining: 8.59s
141:	learn: 0.0068505	total: 16.3s	remaining: 8.51s
142:	learn: 0.0068287	total: 16.5s	remaining: 8.41s
143:	learn: 0.0067370	total: 16.6s	remaining: 8.32s
144:	learn: 0.0066750	total: 16.8s	remaining: 8.23s
145:	learn: 0.0066315	total: 17s	remaining: 8.14s
146:	learn: 0.0065639	total: 17.2s	remaining: 8.05s
147:	learn: 0.0064608	total: 17.3s	remaining: 7.97s
148:	learn: 0.0064025	total: 17.5s	remaining: 7.88s
149:	learn: 0.0063569	total: 17.7s	remaining: 7.78s
150:	learn: 0.0063162	total: 17.9s	remaining: 7.68s
151:	learn: 0.0062715	total: 18s	remaining: 7.59s
152:	learn: 0.0062302	total: 18.2s	remaining: 7.5s
153:	learn: 0.0062045	total: 18.4s	remaining: 7.4s
154:	learn: 0.0061404	total: 18.6s	remaining: 7.3s
155:	learn: 0.0060968	total: 18.7s	remaining: 7.2s
156:	learn: 0.0060614	total: 18.9s	remaining: 7.1s
157:	learn: 0.0059996	total: 19.1s	remaining: 7s
158:	learn: 0.0059364	total: 19.2s	remaining: 6.9s
159:	learn: 0.0058360	total: 19.4s	remaining: 6.8s
160:	learn: 0.0058064	total: 19.6s	remaining: 6.69s
161:	learn: 0.0057571	total: 19.8s	remaining: 6.59s
162:	learn: 0.0056737	total: 20s	remaining: 6.49s
163:	learn: 0.0056469	total: 20.1s	remaining: 6.38s
164:	learn: 0.0055597	total: 20.3s	remaining: 6.28s
165:	learn: 0.0055077	total: 20.5s	remaining: 6.16s
166:	learn: 0.0054600	total: 20.5s	remaining: 6.03s
167:	learn: 0.0054307	total: 20.6s	remaining: 5.89s
168:	learn: 0.0053760	total: 20.7s	remaining: 5.77s
169:	learn: 0.0053377	total: 20.8s	remaining: 5.63s
170:	learn: 0.0052949	total: 20.9s	remaining: 5.5s
171:	learn: 0.0052364	total: 21s	remaining: 5.38s
172:	learn: 0.0052072	total: 21.1s	remaining: 5.25s
173:	learn: 0.0051752	total: 21.2s	remaining: 5.12s
174:	learn: 0.0051268	total: 21.3s	remaining: 4.99s
175:	learn: 0.0050903	total: 21.4s	remaining: 4.86s
176:	learn: 0.0050700	total: 21.5s	remaining: 4.73s
177:	learn: 0.0050393	total: 21.6s	remaining: 4.61s
178:	learn: 0.0049879	total: 21.7s	remaining: 4.48s
179:	learn: 0.0049661	total: 21.7s	remaining: 4.35s
180:	learn: 0.0048998	total: 21.9s	remaining: 4.22s
181:	learn: 0.0048734	total: 21.9s	remaining: 4.1s
182:	learn: 0.0048532	total: 22s	remaining: 3.97s
183:	learn: 0.0048095	total: 22.1s	remaining: 3.85s
184:	learn: 0.0047831	total: 22.2s	remaining: 3.72s
185:	learn: 0.0047546	total: 22.3s	remaining: 3.59s
186:	learn: 0.0046968	total: 22.4s	remaining: 3.47s
187:	learn: 0.0046433	total: 22.5s	remaining: 3.35s
188:	learn: 0.0046328	total: 22.6s	remaining: 3.22s
189:	learn: 0.0046328	total: 22.6s	remaining: 3.1s
190:	learn: 0.0045666	total: 22.8s	remaining: 2.98s
191:	learn: 0.0045666	total: 22.8s	remaining: 2.85s
192:	learn: 0.0045236	total: 22.9s	remaining: 2.73s
193:	learn: 0.0044859	total: 23s	remaining: 2.61s
194:	learn: 0.0044624	total: 23.1s	remaining: 2.49s
195:	learn: 0.0044339	total: 23.2s	remaining: 2.37s
196:	learn: 0.0043913	total: 23.3s	remaining: 2.25s
197:	learn: 0.0043559	total: 23.4s	remaining: 2.13s
198:	learn: 0.0042993	total: 23.5s	remaining: 2.01s
199:	learn: 0.0042776	total: 23.6s	remaining: 1.89s
200:	learn: 0.0042519	total: 23.7s	remaining: 1.77s
201:	learn: 0.0042519	total: 23.8s	remaining: 1.65s
202:	learn: 0.0042031	total: 23.9s	remaining: 1.53s
203:	learn: 0.0041786	total: 24s	remaining: 1.41s
204:	learn: 0.0041229	total: 24.1s	remaining: 1.29s
205:	learn: 0.0040859	total: 24.2s	remaining: 1.17s
206:	learn: 0.0040399	total: 24.2s	remaining: 1.05s
207:	learn: 0.0040278	total: 24.3s	remaining: 936ms
208:	learn: 0.0040015	total: 24.4s	remaining: 818ms
209:	learn: 0.0039749	total: 24.5s	remaining: 701ms
210:	learn: 0.0039499	total: 24.6s	remaining: 583ms
211:	learn: 0.0039258	total: 24.7s	remaining: 466ms
212:	learn: 0.0039187	total: 24.8s	remaining: 349ms
213:	learn: 0.0039187	total: 24.9s	remaining: 233ms
214:	learn: 0.0038607	total: 25s	remaining: 116ms
215:	learn: 0.0038413	total: 25.1s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.44
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.72
 - F1-Score_Train: 99.72
 - Precision_Test: 17.43
 - Recall_Test: 87.30
 - AUPRC_Test: 79.10
 - Accuracy_Test: 99.28
 - F1-Score_Test: 29.06
 - max_depth: 6
 - n_estimators: 216
 - learning_rate: 0.05
 - scale_pos_weight: 8.37
 - (all other CatBoost parameters: None, i.e. library defaults; omitted)
✅ Tamaño del DataFrame actualizado: (1, 133)
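The fold logs above all repeat the same pipeline: oversample the training fold only to a 50/50 class balance, fit a boosted model with the tuned hyperparameters, and score AUPRC on the untouched test fold. A minimal runnable sketch of that flow follows; it is not the notebook's code: `GradientBoostingClassifier` and the `naive_smote` interpolation helper stand in for CatBoost and imblearn's SMOTE, and the synthetic dataset from `make_classification` replaces the credit-card data.

```python
# Sketch of the per-fold pipeline: balance the TRAIN fold to 50/50,
# train, then evaluate AUPRC on the untouched test fold.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

def naive_smote(X, y, seed=0):
    """Toy SMOTE stand-in: interpolate random minority pairs until balanced."""
    rng = np.random.default_rng(seed)
    X_min = X[y == 1]
    n_new = int((y == 0).sum() - (y == 1).sum())
    i = rng.integers(0, len(X_min), n_new)
    j = rng.integers(0, len(X_min), n_new)
    lam = rng.random((n_new, 1))
    X_syn = X_min[i] + lam * (X_min[j] - X_min[i])
    return np.vstack([X, X_syn]), np.concatenate([y, np.ones(n_new, dtype=int)])

# Highly imbalanced toy data (~2% positives, like the fraud class here).
X, y = make_classification(n_samples=3000, weights=[0.98], random_state=42)

aucs = []
for tr, te in StratifiedKFold(n_splits=3, shuffle=True, random_state=42).split(X, y):
    X_res, y_res = naive_smote(X[tr], y[tr])   # resample the train fold only
    assert abs(y_res.mean() - 0.5) < 1e-9      # 50/50 after resampling
    model = GradientBoostingClassifier(n_estimators=50, max_depth=3,
                                       learning_rate=0.05, random_state=0)
    model.fit(X_res, y_res)
    aucs.append(average_precision_score(y[te], model.predict_proba(X[te])[:, 1]))

print("AUPRC per fold:", [round(a, 3) for a in aucs])
```

Resampling inside each fold, never before the split, is what keeps the test folds at the original 0.998/0.002 class proportions shown in the "Antes de SMOTE" printouts.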

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5810025	total: 86.3ms	remaining: 18.6s
... [214 intermediate iterations omitted] ...
215:	learn: 0.0056770	total: 24.8s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.19
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.59
 - F1-Score_Train: 99.59
 - Precision_Test: 15.16
 - Recall_Test: 88.89
 - AUPRC_Test: 71.93
 - Accuracy_Test: 99.14
 - F1-Score_Test: 25.90
 - max_depth: 6
 - n_estimators: 216
 - learning_rate: 0.05
 - scale_pos_weight: 8.37
 - (all other CatBoost parameters: None, i.e. library defaults; omitted)
✅ Tamaño del DataFrame actualizado: (2, 133)
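The summary table promised in the index tracks the false-negative rate (FNR), which for the fraud class is simply the complement of the test recall reported in each fold (e.g. Recall_Test = 88.89 implies FNR ≈ 11.11%). A quick illustration with hypothetical labels, using the standard confusion-matrix convention:

```python
# FNR (tasa de falsos negativos) = FN / (FN + TP) = 1 - recall
# for the positive (fraud) class.
from sklearn.metrics import confusion_matrix, recall_score

y_true = [0, 0, 0, 1, 1, 1, 1, 1, 1]   # hypothetical labels
y_pred = [0, 0, 1, 1, 1, 1, 1, 0, 1]   # one fraud case missed
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
fnr = fn / (fn + tp)
assert abs(fnr - (1 - recall_score(y_true, y_pred))) < 1e-12
print(f"Recall={1 - fnr:.3f}  FNR={fnr:.3f}")
```

In fraud detection the FNR is the fraction of real frauds that slip through, which is why the notebook favors recall/AUPRC over raw accuracy on this 0.17%-positive dataset.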

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5797772	total: 87.1ms	remaining: 18.7s
... [intermediate iterations omitted] ...
128:	learn: 0.0098613	total: 14.9s	remaining: 10s
129:	learn: 0.0097707	total: 15s	remaining: 9.91s
130:	learn: 0.0096602	total: 15.1s	remaining: 9.78s
131:	learn: 0.0094674	total: 15.2s	remaining: 9.66s
132:	learn: 0.0093228	total: 15.3s	remaining: 9.53s
133:	learn: 0.0092346	total: 15.4s	remaining: 9.41s
134:	learn: 0.0091734	total: 15.5s	remaining: 9.28s
135:	learn: 0.0091205	total: 15.6s	remaining: 9.15s
136:	learn: 0.0090726	total: 15.6s	remaining: 9.02s
137:	learn: 0.0089996	total: 15.7s	remaining: 8.9s
138:	learn: 0.0089634	total: 15.8s	remaining: 8.76s
139:	learn: 0.0088821	total: 15.9s	remaining: 8.64s
140:	learn: 0.0088137	total: 16s	remaining: 8.52s
141:	learn: 0.0087660	total: 16.1s	remaining: 8.4s
142:	learn: 0.0086833	total: 16.2s	remaining: 8.28s
143:	learn: 0.0086384	total: 16.3s	remaining: 8.15s
144:	learn: 0.0085373	total: 16.4s	remaining: 8.04s
145:	learn: 0.0083843	total: 16.5s	remaining: 7.91s
146:	learn: 0.0082733	total: 16.6s	remaining: 7.79s
147:	learn: 0.0081725	total: 16.7s	remaining: 7.67s
148:	learn: 0.0080945	total: 16.8s	remaining: 7.55s
149:	learn: 0.0079739	total: 16.9s	remaining: 7.44s
150:	learn: 0.0078996	total: 17s	remaining: 7.32s
151:	learn: 0.0077940	total: 17.1s	remaining: 7.19s
152:	learn: 0.0077194	total: 17.2s	remaining: 7.08s
153:	learn: 0.0076566	total: 17.3s	remaining: 6.95s
154:	learn: 0.0075881	total: 17.4s	remaining: 6.83s
155:	learn: 0.0075485	total: 17.5s	remaining: 6.72s
156:	learn: 0.0074657	total: 17.6s	remaining: 6.6s
157:	learn: 0.0074008	total: 17.6s	remaining: 6.47s
158:	learn: 0.0073317	total: 17.7s	remaining: 6.36s
159:	learn: 0.0072578	total: 17.8s	remaining: 6.24s
160:	learn: 0.0072023	total: 17.9s	remaining: 6.12s
161:	learn: 0.0071543	total: 18s	remaining: 6s
162:	learn: 0.0071226	total: 18.1s	remaining: 5.88s
163:	learn: 0.0070654	total: 18.2s	remaining: 5.76s
164:	learn: 0.0069923	total: 18.3s	remaining: 5.65s
165:	learn: 0.0069402	total: 18.4s	remaining: 5.53s
166:	learn: 0.0068724	total: 18.5s	remaining: 5.42s
167:	learn: 0.0068162	total: 18.6s	remaining: 5.31s
168:	learn: 0.0067246	total: 18.7s	remaining: 5.19s
169:	learn: 0.0066986	total: 18.8s	remaining: 5.08s
170:	learn: 0.0066495	total: 18.9s	remaining: 4.96s
171:	learn: 0.0065589	total: 19s	remaining: 4.85s
172:	learn: 0.0064989	total: 19s	remaining: 4.73s
173:	learn: 0.0064495	total: 19.1s	remaining: 4.62s
174:	learn: 0.0063777	total: 19.3s	remaining: 4.51s
175:	learn: 0.0062822	total: 19.3s	remaining: 4.4s
176:	learn: 0.0062606	total: 19.5s	remaining: 4.29s
177:	learn: 0.0062134	total: 19.5s	remaining: 4.17s
178:	learn: 0.0061794	total: 19.6s	remaining: 4.05s
179:	learn: 0.0061136	total: 19.7s	remaining: 3.94s
180:	learn: 0.0060588	total: 19.8s	remaining: 3.83s
181:	learn: 0.0059936	total: 19.9s	remaining: 3.71s
182:	learn: 0.0059474	total: 20s	remaining: 3.6s
183:	learn: 0.0058800	total: 20.1s	remaining: 3.49s
184:	learn: 0.0058362	total: 20.2s	remaining: 3.38s
185:	learn: 0.0058125	total: 20.3s	remaining: 3.27s
186:	learn: 0.0057523	total: 20.4s	remaining: 3.16s
187:	learn: 0.0057372	total: 20.4s	remaining: 3.04s
188:	learn: 0.0056991	total: 20.5s	remaining: 2.94s
189:	learn: 0.0056625	total: 20.6s	remaining: 2.82s
190:	learn: 0.0056405	total: 20.7s	remaining: 2.71s
191:	learn: 0.0056204	total: 20.8s	remaining: 2.6s
192:	learn: 0.0055920	total: 20.9s	remaining: 2.49s
193:	learn: 0.0055566	total: 21s	remaining: 2.38s
194:	learn: 0.0055315	total: 21.1s	remaining: 2.27s
195:	learn: 0.0054642	total: 21.2s	remaining: 2.16s
196:	learn: 0.0054157	total: 21.4s	remaining: 2.06s
197:	learn: 0.0053870	total: 21.5s	remaining: 1.95s
198:	learn: 0.0052976	total: 21.7s	remaining: 1.85s
199:	learn: 0.0052480	total: 21.8s	remaining: 1.75s
200:	learn: 0.0052243	total: 22s	remaining: 1.64s
201:	learn: 0.0052132	total: 22.2s	remaining: 1.54s
202:	learn: 0.0051845	total: 22.4s	remaining: 1.43s
203:	learn: 0.0051393	total: 22.5s	remaining: 1.33s
204:	learn: 0.0050845	total: 22.7s	remaining: 1.22s
205:	learn: 0.0050378	total: 22.9s	remaining: 1.11s
206:	learn: 0.0049978	total: 23s	remaining: 1s
207:	learn: 0.0049684	total: 23.2s	remaining: 892ms
208:	learn: 0.0049272	total: 23.4s	remaining: 782ms
209:	learn: 0.0048873	total: 23.5s	remaining: 672ms
210:	learn: 0.0048550	total: 23.7s	remaining: 561ms
211:	learn: 0.0048125	total: 23.8s	remaining: 450ms
212:	learn: 0.0047755	total: 24s	remaining: 338ms
213:	learn: 0.0047556	total: 24.2s	remaining: 226ms
214:	learn: 0.0047029	total: 24.4s	remaining: 113ms
215:	learn: 0.0046766	total: 24.5s	remaining: 0us
[I 2024-12-19 14:23:13,602] Trial 14 finished with value: 75.70747631678337 and parameters: {'learning_rate': 0.04518205664082046, 'max_depth': 6, 'n_estimators': 216, 'scale_pos_weight': 8.366830949370627}. Best is trial 4 with value: 76.71888513333191.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.28
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.64
 - F1-Score_Train: 99.64
 - Precision_Test: 15.98
 - Recall_Test: 86.51
 - AUPRC_Test: 76.09
 - Accuracy_Test: 99.21
 - F1-Score_Test: 26.98
 - max_depth: 6
 - n_estimators: 216
 - learning_rate: 0.05
 - scale_pos_weight: 8.37
 - (resto de hiperparámetros de CatBoost con valor None, omitidos por brevedad)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 75.7075
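La métrica de selección usada arriba es el AUPRC (average precision, promediada entre folds). Como referencia didáctica (no es el código del notebook), este boceto calcula la average precision a mano; para puntuaciones sin empates coincide con `sklearn.metrics.average_precision_score`:

```python
# Boceto ilustrativo: average precision (AUPRC) calculada a mano.
# Suma la precisión en cada posición donde cae un positivo, dividida
# entre el total de positivos. Equivale a sklearn cuando no hay empates
# en las puntuaciones; con empates sklearn agrupa por umbral.
def average_precision(y_true, y_score):
    # Índices ordenados por puntuación descendente
    orden = sorted(range(len(y_score)), key=lambda i: -y_score[i])
    total_pos = sum(y_true)
    tp = 0
    ap = 0.0
    for rank, i in enumerate(orden, start=1):
        if y_true[i] == 1:
            tp += 1
            ap += tp / rank  # precisión en este punto de recall
    return ap / total_pos

y_true = [1, 0, 1, 0, 0]
y_score = [0.9, 0.8, 0.7, 0.3, 0.2]
print(round(average_precision(y_true, y_score), 4))  # 0.8333
```

En validación cruzada, el notebook promedia este valor sobre los folds para obtener el "Promedio de AUPRC" reportado.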

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
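El paso anterior muestra cómo SMOTE lleva la proporción de clases de ~0.998/0.002 a 0.5/0.5 dentro de cada fold. Como boceto ilustrativo (no es la implementación del notebook, que usa SMOTE de `imblearn` con interpolación de vecinos), el efecto sobre las proporciones puede reproducirse con un sobremuestreo por duplicación en un dataset de juguete:

```python
# Boceto didáctico: efecto del balanceo sobre la proporción de clases.
# Se usa sobremuestreo por duplicación aleatoria como aproximación;
# SMOTE real genera muestras sintéticas interpolando vecinos cercanos.
from collections import Counter
import random

random.seed(42)

# Dataset de juguete con proporción ~0.9983/0.0017, similar al notebook
y = [0] * 59366 + [1] * 100

def proporciones(labels):
    n = len(labels)
    c = Counter(labels)
    return {k: c[k] / n for k in sorted(c)}

antes = proporciones(y)

# Sobremuestrear la clase minoritaria hasta igualar a la mayoritaria
minoritaria = [etiqueta for etiqueta in y if etiqueta == 1]
faltan = y.count(0) - y.count(1)
y_balanceado = y + [random.choice(minoritaria) for _ in range(faltan)]

despues = proporciones(y_balanceado)
print(antes)    # aprox {0: 0.9983, 1: 0.0017}
print(despues)  # {0: 0.5, 1: 0.5}
```

Nótese que el balanceo se aplica solo al conjunto de entrenamiento de cada fold; el conjunto de prueba conserva la distribución original, de ahí la caída de precisión en test.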

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Registro de entrenamiento CatBoost abreviado: iteraciones 0–247, learn 0.5991828 → 0.0185789, tiempo total 19.4s]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 96.49
 - Recall_Train: 100.00
 - AUPRC_Train: 99.90
 - Accuracy_Train: 98.18
 - F1-Score_Train: 98.21
 - Precision_Test: 3.91
 - Recall_Test: 91.27
 - AUPRC_Test: 64.61
 - Accuracy_Test: 96.22
 - F1-Score_Test: 7.51
 - max_depth: 3
 - n_estimators: 248
 - learning_rate: 0.04
 - scale_pos_weight: 8.68
 - (resto de hiperparámetros de CatBoost con valor None, omitidos por brevedad)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Registro de entrenamiento CatBoost abreviado (truncado): iteraciones 0–142, learn 0.6047898 → 0.0447851, tiempo parcial 12s]
143:	learn: 0.0445826	total: 12.1s	remaining: 8.7s
144:	learn: 0.0443667	total: 12.1s	remaining: 8.61s
145:	learn: 0.0442078	total: 12.2s	remaining: 8.51s
146:	learn: 0.0440053	total: 12.3s	remaining: 8.43s
147:	learn: 0.0438650	total: 12.3s	remaining: 8.33s
148:	learn: 0.0436862	total: 12.4s	remaining: 8.24s
149:	learn: 0.0435298	total: 12.5s	remaining: 8.14s
150:	learn: 0.0433974	total: 12.5s	remaining: 8.05s
151:	learn: 0.0431248	total: 12.6s	remaining: 7.96s
152:	learn: 0.0430239	total: 12.7s	remaining: 7.86s
153:	learn: 0.0428941	total: 12.7s	remaining: 7.76s
154:	learn: 0.0427302	total: 12.8s	remaining: 7.67s
155:	learn: 0.0425413	total: 12.8s	remaining: 7.58s
156:	learn: 0.0424036	total: 12.9s	remaining: 7.48s
157:	learn: 0.0421482	total: 13s	remaining: 7.39s
158:	learn: 0.0419359	total: 13s	remaining: 7.3s
159:	learn: 0.0417191	total: 13.1s	remaining: 7.21s
160:	learn: 0.0414546	total: 13.2s	remaining: 7.12s
161:	learn: 0.0413150	total: 13.2s	remaining: 7.03s
162:	learn: 0.0411571	total: 13.3s	remaining: 6.95s
163:	learn: 0.0410310	total: 13.4s	remaining: 6.86s
164:	learn: 0.0407518	total: 13.5s	remaining: 6.77s
165:	learn: 0.0405802	total: 13.5s	remaining: 6.68s
166:	learn: 0.0404335	total: 13.6s	remaining: 6.59s
167:	learn: 0.0402887	total: 13.7s	remaining: 6.51s
168:	learn: 0.0401806	total: 13.7s	remaining: 6.42s
169:	learn: 0.0399305	total: 13.8s	remaining: 6.33s
170:	learn: 0.0397803	total: 13.9s	remaining: 6.26s
171:	learn: 0.0396735	total: 13.9s	remaining: 6.16s
172:	learn: 0.0394581	total: 14s	remaining: 6.07s
173:	learn: 0.0392833	total: 14.1s	remaining: 5.99s
174:	learn: 0.0391576	total: 14.2s	remaining: 5.91s
175:	learn: 0.0389018	total: 14.2s	remaining: 5.82s
176:	learn: 0.0386844	total: 14.3s	remaining: 5.73s
177:	learn: 0.0385720	total: 14.4s	remaining: 5.65s
178:	learn: 0.0383755	total: 14.4s	remaining: 5.57s
179:	learn: 0.0381341	total: 14.5s	remaining: 5.48s
180:	learn: 0.0380364	total: 14.6s	remaining: 5.39s
181:	learn: 0.0379280	total: 14.6s	remaining: 5.3s
182:	learn: 0.0377587	total: 14.7s	remaining: 5.22s
183:	learn: 0.0376786	total: 14.8s	remaining: 5.13s
184:	learn: 0.0375758	total: 14.8s	remaining: 5.04s
185:	learn: 0.0374278	total: 14.9s	remaining: 4.96s
186:	learn: 0.0372866	total: 14.9s	remaining: 4.87s
187:	learn: 0.0371159	total: 15s	remaining: 4.79s
188:	learn: 0.0369723	total: 15.1s	remaining: 4.7s
189:	learn: 0.0368524	total: 15.1s	remaining: 4.62s
190:	learn: 0.0366278	total: 15.2s	remaining: 4.54s
191:	learn: 0.0364239	total: 15.3s	remaining: 4.45s
192:	learn: 0.0362579	total: 15.3s	remaining: 4.37s
193:	learn: 0.0361497	total: 15.4s	remaining: 4.29s
194:	learn: 0.0358636	total: 15.5s	remaining: 4.21s
195:	learn: 0.0356636	total: 15.5s	remaining: 4.12s
196:	learn: 0.0355510	total: 15.6s	remaining: 4.04s
197:	learn: 0.0354074	total: 15.7s	remaining: 3.96s
198:	learn: 0.0352630	total: 15.7s	remaining: 3.87s
199:	learn: 0.0350854	total: 15.8s	remaining: 3.79s
200:	learn: 0.0349425	total: 15.9s	remaining: 3.71s
201:	learn: 0.0347814	total: 15.9s	remaining: 3.63s
202:	learn: 0.0345733	total: 16s	remaining: 3.55s
203:	learn: 0.0344711	total: 16.1s	remaining: 3.46s
204:	learn: 0.0343359	total: 16.1s	remaining: 3.38s
205:	learn: 0.0341087	total: 16.2s	remaining: 3.3s
206:	learn: 0.0339780	total: 16.3s	remaining: 3.22s
207:	learn: 0.0338694	total: 16.3s	remaining: 3.14s
208:	learn: 0.0337803	total: 16.4s	remaining: 3.06s
209:	learn: 0.0336906	total: 16.5s	remaining: 2.98s
210:	learn: 0.0336173	total: 16.5s	remaining: 2.9s
211:	learn: 0.0335246	total: 16.6s	remaining: 2.82s
212:	learn: 0.0334380	total: 16.7s	remaining: 2.74s
213:	learn: 0.0333633	total: 16.7s	remaining: 2.66s
214:	learn: 0.0331157	total: 16.8s	remaining: 2.58s
215:	learn: 0.0330005	total: 16.9s	remaining: 2.5s
216:	learn: 0.0328632	total: 16.9s	remaining: 2.42s
217:	learn: 0.0326594	total: 17s	remaining: 2.34s
218:	learn: 0.0325104	total: 17.1s	remaining: 2.26s
219:	learn: 0.0323855	total: 17.1s	remaining: 2.18s
220:	learn: 0.0322865	total: 17.2s	remaining: 2.1s
221:	learn: 0.0320659	total: 17.3s	remaining: 2.02s
222:	learn: 0.0318481	total: 17.3s	remaining: 1.94s
223:	learn: 0.0317643	total: 17.4s	remaining: 1.86s
224:	learn: 0.0316660	total: 17.5s	remaining: 1.78s
225:	learn: 0.0315504	total: 17.5s	remaining: 1.71s
226:	learn: 0.0314020	total: 17.6s	remaining: 1.63s
227:	learn: 0.0312559	total: 17.7s	remaining: 1.55s
228:	learn: 0.0311532	total: 17.7s	remaining: 1.47s
229:	learn: 0.0310174	total: 17.8s	remaining: 1.4s
230:	learn: 0.0308636	total: 17.9s	remaining: 1.32s
231:	learn: 0.0307757	total: 18.1s	remaining: 1.25s
232:	learn: 0.0306300	total: 18.2s	remaining: 1.17s
233:	learn: 0.0305010	total: 18.3s	remaining: 1.1s
234:	learn: 0.0303479	total: 18.5s	remaining: 1.02s
235:	learn: 0.0301603	total: 18.6s	remaining: 947ms
236:	learn: 0.0300237	total: 18.8s	remaining: 871ms
237:	learn: 0.0299330	total: 18.9s	remaining: 794ms
238:	learn: 0.0298104	total: 19s	remaining: 716ms
239:	learn: 0.0297063	total: 19.2s	remaining: 639ms
240:	learn: 0.0295186	total: 19.3s	remaining: 560ms
241:	learn: 0.0294374	total: 19.4s	remaining: 481ms
242:	learn: 0.0293288	total: 19.5s	remaining: 402ms
243:	learn: 0.0292335	total: 19.7s	remaining: 322ms
244:	learn: 0.0291249	total: 19.8s	remaining: 242ms
245:	learn: 0.0290077	total: 19.9s	remaining: 162ms
246:	learn: 0.0288441	total: 20s	remaining: 81.2ms
247:	learn: 0.0287279	total: 20.2s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 94.91
 - Recall_Train: 99.99
 - AUPRC_Train: 99.88
 - Accuracy_Train: 97.32
 - F1-Score_Train: 97.39
 - Precision_Test: 2.89
 - Recall_Test: 96.03
 - AUPRC_Test: 71.43
 - Accuracy_Test: 94.56
 - F1-Score_Test: 5.61
 - max_depth: 3
 - n_estimators: 248
 - learning_rate: 0.04
 - scale_pos_weight: 8.68
 [all other CatBoost hyperparameters were None (library defaults); the full dump is omitted]
✅ Updated DataFrame size: (2, 133)

🔄 Fold 3: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
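The proportions printed here come from rebalancing the training fold before fitting: SMOTE (from the `imbalanced-learn` package) synthesizes new minority-class samples by interpolating between minority nearest neighbours until both classes are at 0.50/0.50. As a dependency-free illustration of the same before/after proportions, here is a naive random-oversampling stand-in (the helper names are hypothetical, not from the notebook):

```python
from collections import Counter
import random

def class_proportions(y):
    """Return {class: fraction}, like value_counts(normalize=True)."""
    counts = Counter(y)
    n = len(y)
    return {c: counts[c] / n for c in sorted(counts)}

def naive_oversample(X, y, seed=42):
    """Duplicate minority rows until both classes have equal counts.

    SMOTE instead creates *new* points by interpolating between
    minority-class nearest neighbours; this is only a stand-in to
    show the 0.50/0.50 class balance it produces.
    """
    rng = random.Random(seed)
    counts = Counter(y)
    maj = max(counts, key=counts.get)
    minc = min(counts, key=counts.get)
    minority = [(x, lab) for x, lab in zip(X, y) if lab == minc]
    X_res, y_res = list(X), list(y)
    while Counter(y_res)[minc] < counts[maj]:
        x, lab = rng.choice(minority)
        X_res.append(x)
        y_res.append(lab)
    return X_res, y_res
```

With real SMOTE the call is `X_res, y_res = SMOTE(random_state=42).fit_resample(X_train, y_train)`, applied only to the training split of each fold so the test fold keeps its natural imbalance.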

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.6017198	total: 132ms	remaining: 32.6s
[... iterations 1–246 omitted: the training loss decreases steadily ...]
247:	learn: 0.0233980	total: 21.1s	remaining: 0us
[I 2024-12-19 14:24:22,636] Trial 15 finished with value: 67.80101170304454 and parameters: {'learning_rate': 0.03857647736265097, 'max_depth': 3, 'n_estimators': 248, 'scale_pos_weight': 8.684304389345261}. Best is trial 4 with value: 76.71888513333191.
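The `Trial 15 finished ...` line above is Optuna's per-trial report: each trial samples `learning_rate`, `max_depth`, `n_estimators` and `scale_pos_weight`, evaluates the fold's AUPRC, and the study keeps the best value seen so far (trial 4 here, at 76.72). A minimal stdlib stand-in for that search loop — the sampling ranges are assumptions, not taken from the notebook's objective:

```python
import random

def run_search(objective, n_trials=20, seed=0):
    """Random-search stand-in for Optuna's study.optimize loop.

    Each "trial" samples a hyperparameter dict, scores it with the
    objective (e.g. mean AUPRC over CV folds), and the best
    (score, params) pair seen so far is tracked, just as Optuna
    reports "Best is trial N with value ...".
    """
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        params = {
            "learning_rate": rng.uniform(0.01, 0.3),   # range assumed
            "max_depth": rng.randint(3, 10),           # range assumed
            "n_estimators": rng.randint(100, 500),     # range assumed
            "scale_pos_weight": rng.uniform(1.0, 10.0),# range assumed
        }
        score = objective(params)
        if best is None or score > best[0]:
            best = (score, params)
    return best
```

With Optuna, the same structure uses `trial.suggest_float` / `trial.suggest_int` inside an objective function passed to `study.optimize(objective, n_trials=...)`; Optuna's TPE sampler then biases later trials toward promising regions instead of sampling uniformly.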
✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 95.95
 - Recall_Train: 99.99
 - AUPRC_Train: 99.87
 - Accuracy_Train: 97.88
 - F1-Score_Train: 97.93
 - Precision_Test: 3.38
 - Recall_Test: 90.48
 - AUPRC_Test: 67.36
 - Accuracy_Test: 95.64
 - F1-Score_Test: 6.53
 - max_depth: 3
 - n_estimators: 248
 - learning_rate: 0.04
 - scale_pos_weight: 8.68
 [all other CatBoost hyperparameters were None (library defaults); the full dump is omitted]
✅ Updated DataFrame size: (3, 133)

🏆 Mean cross-validation AUPRC: 67.8010
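The cross-validation figure above is a mean of per-fold AUPRC values. AUPRC here is the average-precision summary of the precision-recall curve: precision at each newly recovered true positive, weighted by the recall step. A self-contained sketch of the metric, implementing the same step-wise formula as scikit-learn's `average_precision_score`:

```python
def average_precision(y_true, scores):
    """AP = sum_n (R_n - R_{n-1}) * P_n over the ranking by score."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    total_pos = sum(y_true)
    tp, prev_recall, ap = 0, 0.0, 0.0
    for rank, i in enumerate(order, start=1):
        if y_true[i] == 1:            # recall only moves at true positives
            tp += 1
            precision = tp / rank
            recall = tp / total_pos
            ap += (recall - prev_recall) * precision
            prev_recall = recall
    return ap
```

On a dataset where frauds are ~0.17% of transactions, AUPRC is far more informative than accuracy or ROC-AUC, because a classifier that predicts "no fraud" everywhere already scores ~99.8% accuracy; the notebook averages this metric across folds to get the 67.80 reported above.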

🔍 Optimizing hyperparameters for CatBoost with Optuna...

🔄 Fold 1: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.6196067	total: 87.5ms	remaining: 18.2s
1:	learn: 0.5500102	total: 182ms	remaining: 18.8s
2:	learn: 0.4825292	total: 276ms	remaining: 19s
3:	learn: 0.4260889	total: 379ms	remaining: 19.4s
4:	learn: 0.3766515	total: 479ms	remaining: 19.5s
5:	learn: 0.3368406	total: 574ms	remaining: 19.4s
6:	learn: 0.3033531	total: 679ms	remaining: 19.6s
7:	learn: 0.2714120	total: 790ms	remaining: 19.9s
8:	learn: 0.2510016	total: 887ms	remaining: 19.7s
9:	learn: 0.2240417	total: 990ms	remaining: 19.7s
10:	learn: 0.2031461	total: 1.08s	remaining: 19.5s
11:	learn: 0.1812339	total: 1.19s	remaining: 19.6s
12:	learn: 0.1668365	total: 1.28s	remaining: 19.4s
13:	learn: 0.1524130	total: 1.37s	remaining: 19.1s
14:	learn: 0.1418610	total: 1.48s	remaining: 19.1s
15:	learn: 0.1308017	total: 1.56s	remaining: 18.9s
16:	learn: 0.1248212	total: 1.65s	remaining: 18.7s
17:	learn: 0.1153612	total: 1.79s	remaining: 19s
18:	learn: 0.1075767	total: 1.88s	remaining: 18.8s
19:	learn: 0.1002620	total: 1.97s	remaining: 18.7s
20:	learn: 0.0955931	total: 2.09s	remaining: 18.7s
21:	learn: 0.0901298	total: 2.18s	remaining: 18.5s
22:	learn: 0.0858951	total: 2.27s	remaining: 18.4s
23:	learn: 0.0811372	total: 2.38s	remaining: 18.3s
24:	learn: 0.0781401	total: 2.47s	remaining: 18.2s
25:	learn: 0.0745462	total: 2.57s	remaining: 18.1s
26:	learn: 0.0724753	total: 2.67s	remaining: 18s
27:	learn: 0.0689974	total: 2.77s	remaining: 17.9s
28:	learn: 0.0661944	total: 2.89s	remaining: 17.9s
29:	learn: 0.0644124	total: 2.98s	remaining: 17.8s
30:	learn: 0.0624467	total: 3.07s	remaining: 17.6s
31:	learn: 0.0600949	total: 3.19s	remaining: 17.6s
32:	learn: 0.0576064	total: 3.29s	remaining: 17.5s
33:	learn: 0.0554102	total: 3.38s	remaining: 17.4s
34:	learn: 0.0534261	total: 3.5s	remaining: 17.4s
35:	learn: 0.0519761	total: 3.59s	remaining: 17.3s
36:	learn: 0.0506500	total: 3.69s	remaining: 17.1s
37:	learn: 0.0489216	total: 3.8s	remaining: 17.1s
38:	learn: 0.0476830	total: 3.91s	remaining: 17s
39:	learn: 0.0466456	total: 4s	remaining: 16.9s
40:	learn: 0.0457433	total: 4.1s	remaining: 16.8s
41:	learn: 0.0446961	total: 4.19s	remaining: 16.7s
42:	learn: 0.0435281	total: 4.28s	remaining: 16.5s
43:	learn: 0.0423079	total: 4.38s	remaining: 16.4s
44:	learn: 0.0414784	total: 4.47s	remaining: 16.3s
45:	learn: 0.0407288	total: 4.56s	remaining: 16.2s
46:	learn: 0.0397580	total: 4.67s	remaining: 16.1s
... (iterations 47–207 omitted; the training loss decreases steadily from 0.0390 to 0.0080) ...
208:	learn: 0.0079921	total: 23.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.54
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.26
 - F1-Score_Train: 99.26
 - Precision_Test: 8.75
 - Recall_Test: 89.68
 - AUPRC_Test: 73.12
 - Accuracy_Test: 98.41
 - F1-Score_Test: 15.94
 - max_depth: 6
 - n_estimators: 209
 - learning_rate: 0.03
 - scale_pos_weight: 8.97
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
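The before/after proportion printouts above are normalized class counts. A self-contained sketch of how they can be produced; the Series here is synthetic, so its counts are assumptions rather than the real split:

```python
# Normalized class proportions, as printed in the "Antes de SMOTE" block.
import pandas as pd

y = pd.Series([0] * 5930 + [1] * 10, name="Class")  # synthetic imbalance
props = y.value_counts(normalize=True)
print("📊 Antes de SMOTE:", props, sep="\n")
# After a 1:1 balancing step, both classes would print as 0.5 each.
```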

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6238656	total: 171ms	remaining: 35.6s
... (iterations 1–207 omitted; the training loss decreases steadily from 0.5634 to 0.0118) ...
208:	learn: 0.0117374	total: 22.7s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.14
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.05
 - F1-Score_Train: 99.06
 - Precision_Test: 7.52
 - Recall_Test: 92.06
 - AUPRC_Test: 72.94
 - Accuracy_Test: 98.08
 - F1-Score_Test: 13.90
 - max_depth: 6
 - n_estimators: 209
 - learning_rate: 0.03
 - scale_pos_weight: 8.97
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)
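The `Sobreajuste: 1` flag in the fold summaries is consistent with the large train/test AUPRC gap (99.97 vs 73.12 in fold 1, 99.97 vs 72.94 in fold 2). A hypothetical reconstruction of such a flag follows; the 5-point tolerance is an assumption for illustration, not necessarily the notebook's actual rule:

```python
# Hypothetical overfitting flag: mark "Sobreajuste" when train AUPRC exceeds
# test AUPRC by more than a tolerance (assumed 5 percentage points).
def sobreajuste(auprc_train: float, auprc_test: float, tol: float = 5.0) -> int:
    return int(auprc_train - auprc_test > tol)

# Fold 1 values from the log above: 99.97 train vs 73.12 test.
print(sobreajuste(99.97, 73.12))  # → 1
```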

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6228610	total: 86.3ms	remaining: 18s
... (iterations 1–190 omitted; the training loss decreases steadily from 0.5589 to 0.0118) ...
191:	learn: 0.0116942	total: 23.4s	remaining: 2.08s
192:	learn: 0.0116381	total: 23.5s	remaining: 1.95s
193:	learn: 0.0115642	total: 23.6s	remaining: 1.82s
194:	learn: 0.0115100	total: 23.7s	remaining: 1.7s
195:	learn: 0.0114276	total: 23.8s	remaining: 1.58s
196:	learn: 0.0113263	total: 23.9s	remaining: 1.46s
197:	learn: 0.0112808	total: 24s	remaining: 1.33s
198:	learn: 0.0111920	total: 24.1s	remaining: 1.21s
199:	learn: 0.0111233	total: 24.2s	remaining: 1.09s
200:	learn: 0.0110733	total: 24.3s	remaining: 967ms
201:	learn: 0.0109776	total: 24.4s	remaining: 845ms
202:	learn: 0.0109197	total: 24.5s	remaining: 724ms
203:	learn: 0.0108327	total: 24.6s	remaining: 602ms
204:	learn: 0.0107411	total: 24.7s	remaining: 481ms
205:	learn: 0.0106502	total: 24.8s	remaining: 361ms
206:	learn: 0.0105448	total: 24.9s	remaining: 240ms
207:	learn: 0.0104799	total: 25s	remaining: 120ms
208:	learn: 0.0103816	total: 25.1s	remaining: 0us
[I 2024-12-19 14:25:41,643] Trial 16 finished with value: 72.7388417743842 and parameters: {'learning_rate': 0.026658237113297848, 'max_depth': 6, 'n_estimators': 209, 'scale_pos_weight': 8.96907757599555}. Best is trial 4 with value: 76.71888513333191.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.09
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 99.03
 - F1-Score_Train: 99.04
 - Precision_Test: 7.14
 - Recall_Test: 88.89
 - AUPRC_Test: 72.16
 - Accuracy_Test: 98.04
 - F1-Score_Test: 13.22
 - Hiperparámetros ajustados por Optuna (los restantes quedan en sus valores por defecto de CatBoost):
 - max_depth: 6
 - n_estimators: 209
 - learning_rate: 0.03
 - scale_pos_weight: 8.97
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 72.7388
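El AUPRC promedio anterior se calcula a partir de las probabilidades predichas en cada fold. Como referencia, un esbozo mínimo con datos de juguete (no los del proyecto) de cómo se obtiene el AUPRC con `average_precision_score` de scikit-learn:

```python
import numpy as np
from sklearn.metrics import average_precision_score

# Etiquetas reales y probabilidades predichas de juguete (clase 1 = fraude)
y_true = np.array([0, 0, 0, 0, 1, 1])
y_prob = np.array([0.1, 0.2, 0.3, 0.8, 0.7, 0.9])

# AUPRC = área bajo la curva precisión-recall (average precision),
# expresada en porcentaje como en las tablas de resultados
auprc = average_precision_score(y_true, y_prob) * 100
print(round(auprc, 2))  # 83.33
```

A diferencia del accuracy, esta métrica solo depende de la ordenación de la clase positiva, por lo que resulta adecuada para datasets tan desbalanceados como este.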

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
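Las proporciones anteriores muestran el reequilibrio 50/50 que produce SMOTE sobre el fold de entrenamiento. Como referencia, un esbozo didáctico (solo con numpy, función hipotética `smote_minimal`) de la idea central del algoritmo: interpolar entre una muestra minoritaria y uno de sus vecinos más cercanos. No sustituye a `imblearn.over_sampling.SMOTE`, que es lo que usa el proyecto:

```python
import numpy as np

def smote_minimal(X_min, n_new, k=5, rng=None):
    """Genera n_new muestras sintéticas interpolando entre cada muestra
    minoritaria elegida al azar y uno de sus k vecinos más cercanos."""
    rng = np.random.default_rng(rng)
    sinteticas = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))           # muestra base aleatoria
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        vecinos = np.argsort(d)[1:k + 1]       # k vecinos más cercanos (sin ella misma)
        j = rng.choice(vecinos)
        lam = rng.random()                     # punto aleatorio del segmento [x_i, x_j]
        sinteticas.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(sinteticas)

# Clase minoritaria de juguete: 4 puntos; se generan 4 sintéticos
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
X_syn = smote_minimal(X_min, n_new=4, rng=42)
print(X_syn.shape)  # (4, 2)
```

Las muestras sintéticas caen siempre sobre segmentos entre puntos minoritarios reales, por lo que SMOTE solo debe aplicarse al conjunto de entrenamiento de cada fold, como se hace aquí, para no filtrar información al test.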

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Registro de entrenamiento de CatBoost (Fold 1, iteraciones 0–252): la pérdida de entrenamiento desciende de 0.5145 a 0.0032 en ~25.4 s en total]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.59
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.79
 - F1-Score_Train: 99.79
 - Precision_Test: 21.48
 - Recall_Test: 87.30
 - AUPRC_Test: 79.51
 - Accuracy_Test: 99.44
 - F1-Score_Test: 34.48
 - Hiperparámetros ajustados por Optuna (los restantes quedan en sus valores por defecto de CatBoost):
 - max_depth: 5
 - n_estimators: 253
 - learning_rate: 0.07
 - scale_pos_weight: 7.54
✅ Tamaño del DataFrame actualizado: (1, 133)
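El indicador «Sobreajuste: 1» de los bloques anteriores refleja la brecha grande entre las métricas de entrenamiento y de test (p. ej., AUPRC_Train 99.99 frente a AUPRC_Test 79.51). Un esbozo mínimo de esa comprobación, con nombre de función y umbral hipotéticos:

```python
def hay_sobreajuste(auprc_train, auprc_test, umbral=10.0):
    """Devuelve 1 si la brecha train-test de AUPRC (en puntos
    porcentuales) supera el umbral indicado; 0 en caso contrario."""
    return int(auprc_train - auprc_test > umbral)

print(hay_sobreajuste(99.99, 79.51))  # 1  (brecha de 20.48 puntos)
print(hay_sobreajuste(81.0, 79.5))    # 0  (brecha de 1.5 puntos)
```

Con SMOTE es esperable cierto sobreajuste aparente: las métricas de entrenamiento se calculan sobre el conjunto balanceado, mientras que el test conserva el desbalance original.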

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Registro de entrenamiento de CatBoost (Fold 2, iteraciones 0–207): la pérdida de entrenamiento desciende de 0.5304 a 0.0050 en ~20 s]
208:	learn: 0.0049921	total: 20.1s	remaining: 4.23s
209:	learn: 0.0049921	total: 20.2s	remaining: 4.13s
210:	learn: 0.0049715	total: 20.3s	remaining: 4.03s
211:	learn: 0.0049290	total: 20.4s	remaining: 3.94s
212:	learn: 0.0048987	total: 20.4s	remaining: 3.84s
213:	learn: 0.0048716	total: 20.5s	remaining: 3.74s
214:	learn: 0.0048567	total: 20.6s	remaining: 3.64s
215:	learn: 0.0048229	total: 20.7s	remaining: 3.54s
216:	learn: 0.0048035	total: 20.8s	remaining: 3.44s
217:	learn: 0.0047625	total: 20.8s	remaining: 3.35s
218:	learn: 0.0047450	total: 20.9s	remaining: 3.25s
219:	learn: 0.0046892	total: 21s	remaining: 3.15s
220:	learn: 0.0046553	total: 21.1s	remaining: 3.06s
221:	learn: 0.0046326	total: 21.2s	remaining: 2.96s
222:	learn: 0.0045589	total: 21.3s	remaining: 2.86s
223:	learn: 0.0045118	total: 21.4s	remaining: 2.77s
224:	learn: 0.0045118	total: 21.4s	remaining: 2.67s
225:	learn: 0.0044798	total: 21.5s	remaining: 2.57s
226:	learn: 0.0044533	total: 21.6s	remaining: 2.47s
227:	learn: 0.0044028	total: 21.7s	remaining: 2.38s
228:	learn: 0.0044028	total: 21.7s	remaining: 2.28s
229:	learn: 0.0044028	total: 21.8s	remaining: 2.18s
230:	learn: 0.0043805	total: 21.9s	remaining: 2.08s
231:	learn: 0.0043804	total: 21.9s	remaining: 1.99s
232:	learn: 0.0043804	total: 22s	remaining: 1.89s
233:	learn: 0.0043657	total: 22.1s	remaining: 1.79s
234:	learn: 0.0043656	total: 22.1s	remaining: 1.7s
235:	learn: 0.0043656	total: 22.2s	remaining: 1.6s
236:	learn: 0.0043656	total: 22.3s	remaining: 1.5s
237:	learn: 0.0043656	total: 22.3s	remaining: 1.41s
238:	learn: 0.0043656	total: 22.4s	remaining: 1.31s
239:	learn: 0.0043656	total: 22.5s	remaining: 1.22s
240:	learn: 0.0043656	total: 22.5s	remaining: 1.12s
241:	learn: 0.0043656	total: 22.6s	remaining: 1.03s
242:	learn: 0.0043656	total: 22.6s	remaining: 932ms
243:	learn: 0.0043656	total: 22.7s	remaining: 837ms
244:	learn: 0.0043656	total: 22.8s	remaining: 743ms
245:	learn: 0.0043656	total: 22.8s	remaining: 649ms
246:	learn: 0.0043656	total: 22.9s	remaining: 556ms
247:	learn: 0.0043656	total: 22.9s	remaining: 462ms
248:	learn: 0.0043656	total: 23s	remaining: 370ms
249:	learn: 0.0043656	total: 23.1s	remaining: 277ms
250:	learn: 0.0043655	total: 23.1s	remaining: 184ms
251:	learn: 0.0043655	total: 23.2s	remaining: 92ms
252:	learn: 0.0043655	total: 23.3s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.45
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.72
 - F1-Score_Train: 99.73
 - Precision_Test: 19.61
 - Recall_Test: 88.10
 - AUPRC_Test: 74.19
 - Accuracy_Test: 99.37
 - F1-Score_Test: 32.08
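El campo "Sobreajuste: 1" del reporte refleja la gran brecha entre el AUPRC de entrenamiento (99.98) y el de test (74.19). Un boceto mínimo de cómo podría calcularse ese marcador binario; el umbral es un supuesto ilustrativo, no necesariamente el usado en el notebook:

```python
# Interpretación del campo "Sobreajuste": brecha train/test en AUPRC.
# UMBRAL es un valor supuesto con fines ilustrativos.
auprc_train, auprc_test = 99.98, 74.19   # valores del fold mostrado arriba
UMBRAL = 5.0                             # puntos porcentuales (supuesto)
sobreajuste = int(auprc_train - auprc_test > UMBRAL)
print(sobreajuste)  # 1: el modelo memoriza el conjunto balanceado con SMOTE
```

La brecha de ~25 puntos confirma que el balanceo con SMOTE facilita memorizar el train, mientras que el test conserva el desbalance original.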
 - Hiperparámetros ajustados por Optuna: max_depth: 5, n_estimators: 253, learning_rate: 0.07, scale_pos_weight: 7.54
 - (los demás parámetros de CatBoost se reportan como None: conservan sus valores por defecto)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
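Las proporciones anteriores muestran el efecto de SMOTE: de ~0.17 % de fraudes a un 50/50. Un boceto mínimo de la idea central de SMOTE (interpolar entre una muestra minoritaria y uno de sus vecinos más cercanos), implementado con NumPy solo con fines ilustrativos; el notebook usa presumiblemente `imblearn.over_sampling.SMOTE`:

```python
import numpy as np

def smote_simple(X_min, n_nuevas, k=5, seed=0):
    """Boceto mínimo de SMOTE: genera muestras sintéticas interpolando
    entre cada muestra minoritaria y uno de sus k vecinos más cercanos."""
    rng = np.random.default_rng(seed)
    sinteticas = []
    for _ in range(n_nuevas):
        i = rng.integers(len(X_min))
        # distancias de la muestra i al resto de la clase minoritaria
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        vecinos = np.argsort(d)[1:k + 1]   # excluye la propia muestra
        j = rng.choice(vecinos)
        lam = rng.random()                 # punto aleatorio del segmento i→j
        sinteticas.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.vstack(sinteticas)

# Desbalance de juguete similar al del dataset (~0.17 % de positivos)
X_may = np.zeros((997, 2))
X_min = np.ones((3, 2)) + np.random.default_rng(1).normal(0, 0.1, (3, 2))
X_sint = smote_simple(X_min, n_nuevas=len(X_may) - len(X_min), k=2)
total_min = len(X_min) + len(X_sint)
print(total_min / (len(X_may) + total_min))  # 0.5: clases balanceadas
```

Tras el sobremuestreo, la clase minoritaria pasa a representar el 50 % del conjunto de entrenamiento, como refleja la salida "Después de SMOTE".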

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5228661	total: 151ms	remaining: 38s
[... iteraciones 1–251 omitidas: la pérdida de entrenamiento desciende de 0.52 a 0.0037 ...]
252:	learn: 0.0037059	total: 24.3s	remaining: 0us
[I 2024-12-19 14:27:02,254] Trial 17 finished with value: 76.82995176096074 and parameters: {'learning_rate': 0.07428627404365123, 'max_depth': 5, 'n_estimators': 253, 'scale_pos_weight': 7.539212242386238}. Best is trial 17 with value: 76.82995176096074.
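La traza de Optuna muestra el trial 17 con los hiperparámetros muestreados (learning_rate≈0.074, max_depth=5, n_estimators=253, scale_pos_weight≈7.5) y el AUPRC medio como valor objetivo a maximizar. Un esquema ilustrativo de esa búsqueda usando búsqueda aleatoria simple con la biblioteca estándar; los rangos y el objetivo de juguete son supuestos, no los del notebook:

```python
import random

# Rangos supuestos, coherentes con los valores vistos en la traza de Optuna;
# no son necesariamente los definidos en el notebook.
ESPACIO = {
    "learning_rate":    lambda rng: rng.uniform(0.01, 0.3),
    "max_depth":        lambda rng: rng.randint(3, 10),
    "n_estimators":     lambda rng: rng.randint(100, 500),
    "scale_pos_weight": lambda rng: rng.uniform(1.0, 10.0),
}

def muestrear_trial(semilla):
    """Muestrea un conjunto de hiperparámetros, como haría un trial."""
    rng = random.Random(semilla)
    return {nombre: f(rng) for nombre, f in ESPACIO.items()}

def buscar(n_trials, objetivo):
    """Búsqueda aleatoria: evalúa n_trials y devuelve (mejor_valor, params)."""
    resultados = [(objetivo(muestrear_trial(s)), muestrear_trial(s))
                  for s in range(n_trials)]
    return max(resultados, key=lambda t: t[0])

# Objetivo de juguete en lugar de entrenar CatBoost y medir el AUPRC real:
def auprc_falso(p):
    return 80 - abs(p["max_depth"] - 5) - 10 * abs(p["learning_rate"] - 0.07)

mejor_valor, mejores_params = buscar(20, auprc_falso)
print(mejor_valor <= 80, 3 <= mejores_params["max_depth"] <= 10)  # True True
```

Optuna hace lo mismo con un muestreador más inteligente (TPE por defecto), que concentra los trials en las regiones prometedoras del espacio.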
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.52
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.76
 - F1-Score_Train: 99.76
 - Precision_Test: 20.77
 - Recall_Test: 85.71
 - AUPRC_Test: 76.79
 - Accuracy_Test: 99.43
 - F1-Score_Test: 33.44
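El F1 reportado se puede verificar directamente a partir de la precisión y el recall del propio fold, con la fórmula F1 = 2·P·R / (P + R):

```python
# Verificación del F1 reportado a partir de Precision_Test y Recall_Test
precision, recall = 20.77, 85.71
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # 33.44, coincide con F1-Score_Test
```

El F1 bajo pese al recall alto (85.71) se explica por la precisión de apenas ~21 %: el modelo detecta casi todos los fraudes, pero a costa de muchos falsos positivos.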
 - Hiperparámetros ajustados por Optuna: max_depth: 5, n_estimators: 253, learning_rate: 0.07, scale_pos_weight: 7.54
 - (los demás parámetros de CatBoost se reportan como None: conservan sus valores por defecto)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 76.8300

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5263613	total: 80.8ms	remaining: 21.3s
[... iteraciones 1–178 omitidas: la pérdida de entrenamiento desciende de 0.53 a 0.0046 ...]
179:	learn: 0.0045777	total: 18.9s	remaining: 8.84s
180:	learn: 0.0045777	total: 19.1s	remaining: 8.74s
181:	learn: 0.0045485	total: 19.2s	remaining: 8.66s
182:	learn: 0.0044843	total: 19.4s	remaining: 8.58s
183:	learn: 0.0044549	total: 19.5s	remaining: 8.49s
184:	learn: 0.0044125	total: 19.7s	remaining: 8.41s
185:	learn: 0.0043929	total: 19.8s	remaining: 8.32s
186:	learn: 0.0043581	total: 20s	remaining: 8.23s
187:	learn: 0.0043413	total: 20.1s	remaining: 8.14s
188:	learn: 0.0043413	total: 20.2s	remaining: 8.04s
189:	learn: 0.0043209	total: 20.4s	remaining: 7.95s
190:	learn: 0.0042865	total: 20.6s	remaining: 7.86s
191:	learn: 0.0042536	total: 20.7s	remaining: 7.77s
192:	learn: 0.0042156	total: 20.9s	remaining: 7.68s
193:	learn: 0.0041798	total: 21s	remaining: 7.59s
194:	learn: 0.0041558	total: 21.2s	remaining: 7.49s
195:	learn: 0.0041377	total: 21.3s	remaining: 7.4s
196:	learn: 0.0041257	total: 21.5s	remaining: 7.3s
197:	learn: 0.0040984	total: 21.6s	remaining: 7.2s
198:	learn: 0.0040774	total: 21.8s	remaining: 7.11s
199:	learn: 0.0040631	total: 21.9s	remaining: 7s
200:	learn: 0.0040438	total: 21.9s	remaining: 6.88s
201:	learn: 0.0040262	total: 22s	remaining: 6.76s
202:	learn: 0.0039898	total: 22.1s	remaining: 6.64s
203:	learn: 0.0039898	total: 22.2s	remaining: 6.52s
204:	learn: 0.0039554	total: 22.3s	remaining: 6.41s
205:	learn: 0.0039382	total: 22.3s	remaining: 6.29s
206:	learn: 0.0039110	total: 22.4s	remaining: 6.17s
207:	learn: 0.0038778	total: 22.5s	remaining: 6.06s
208:	learn: 0.0038456	total: 22.6s	remaining: 5.95s
209:	learn: 0.0038455	total: 22.6s	remaining: 5.82s
210:	learn: 0.0038455	total: 22.7s	remaining: 5.7s
211:	learn: 0.0038387	total: 22.8s	remaining: 5.59s
212:	learn: 0.0038024	total: 22.9s	remaining: 5.48s
213:	learn: 0.0037806	total: 23s	remaining: 5.37s
214:	learn: 0.0037744	total: 23.1s	remaining: 5.25s
215:	learn: 0.0037543	total: 23.1s	remaining: 5.14s
216:	learn: 0.0037543	total: 23.2s	remaining: 5.03s
217:	learn: 0.0037220	total: 23.3s	remaining: 4.91s
218:	learn: 0.0037220	total: 23.4s	remaining: 4.8s
219:	learn: 0.0037220	total: 23.4s	remaining: 4.68s
220:	learn: 0.0037219	total: 23.5s	remaining: 4.57s
221:	learn: 0.0037219	total: 23.6s	remaining: 4.46s
222:	learn: 0.0037086	total: 23.6s	remaining: 4.35s
223:	learn: 0.0036853	total: 23.7s	remaining: 4.24s
224:	learn: 0.0036853	total: 23.8s	remaining: 4.12s
225:	learn: 0.0036852	total: 23.9s	remaining: 4.01s
226:	learn: 0.0036852	total: 23.9s	remaining: 3.9s
227:	learn: 0.0036808	total: 24s	remaining: 3.79s
228:	learn: 0.0036451	total: 24.1s	remaining: 3.68s
229:	learn: 0.0036451	total: 24.2s	remaining: 3.57s
230:	learn: 0.0036450	total: 24.2s	remaining: 3.46s
231:	learn: 0.0036450	total: 24.3s	remaining: 3.35s
232:	learn: 0.0036450	total: 24.4s	remaining: 3.24s
233:	learn: 0.0036450	total: 24.4s	remaining: 3.13s
234:	learn: 0.0036450	total: 24.5s	remaining: 3.02s
235:	learn: 0.0036450	total: 24.5s	remaining: 2.91s
236:	learn: 0.0036450	total: 24.6s	remaining: 2.8s
237:	learn: 0.0036450	total: 24.7s	remaining: 2.7s
238:	learn: 0.0036450	total: 24.8s	remaining: 2.59s
239:	learn: 0.0036450	total: 24.8s	remaining: 2.48s
240:	learn: 0.0036450	total: 24.9s	remaining: 2.38s
241:	learn: 0.0036450	total: 25s	remaining: 2.27s
242:	learn: 0.0036450	total: 25s	remaining: 2.16s
243:	learn: 0.0036450	total: 25.1s	remaining: 2.06s
244:	learn: 0.0036450	total: 25.2s	remaining: 1.95s
245:	learn: 0.0036450	total: 25.2s	remaining: 1.85s
246:	learn: 0.0036450	total: 25.3s	remaining: 1.74s
247:	learn: 0.0036450	total: 25.4s	remaining: 1.64s
248:	learn: 0.0036450	total: 25.4s	remaining: 1.53s
249:	learn: 0.0036450	total: 25.5s	remaining: 1.43s
250:	learn: 0.0036450	total: 25.6s	remaining: 1.32s
251:	learn: 0.0036450	total: 25.6s	remaining: 1.22s
252:	learn: 0.0036450	total: 25.7s	remaining: 1.12s
253:	learn: 0.0036449	total: 25.8s	remaining: 1.01s
254:	learn: 0.0036449	total: 25.8s	remaining: 912ms
255:	learn: 0.0036449	total: 25.9s	remaining: 810ms
256:	learn: 0.0036449	total: 26s	remaining: 708ms
257:	learn: 0.0036449	total: 26.1s	remaining: 606ms
258:	learn: 0.0036449	total: 26.1s	remaining: 504ms
259:	learn: 0.0036449	total: 26.2s	remaining: 403ms
260:	learn: 0.0036449	total: 26.3s	remaining: 302ms
261:	learn: 0.0036449	total: 26.3s	remaining: 201ms
262:	learn: 0.0036449	total: 26.4s	remaining: 100ms
263:	learn: 0.0036449	total: 26.5s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.52
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.76
 - F1-Score_Train: 99.76
 - Precision_Test: 19.60
 - Recall_Test: 85.71
 - AUPRC_Test: 75.48
 - Accuracy_Test: 99.38
 - F1-Score_Test: 31.91
 - max_depth: 5
 - n_estimators: 264
 - learning_rate: 0.07
 - scale_pos_weight: 7.47
✅ Tamaño del DataFrame actualizado: (1, 133)
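The per-fold proportions printed in this run (~0.17% positives before SMOTE, 50/50 after) come from applying SMOTE only to the training split of each fold, never to the test split. A minimal sketch of that pattern, using a plain NumPy interpolation in place of imbalanced-learn's `SMOTE` (the real SMOTE interpolates between k-nearest minority neighbours; this simplified version picks random minority pairs):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold

# Toy imbalanced dataset (~2% positive class, similar in spirit to the real one)
X, y = make_classification(n_samples=2000, weights=[0.98], random_state=42)

def oversample_minority(X_train, y_train, rng):
    """Oversample by interpolating between random minority pairs
    (the core idea of SMOTE; imbalanced-learn's SMOTE uses k-NN pairs)."""
    X_min = X_train[y_train == 1]
    n_new = int((y_train == 0).sum() - (y_train == 1).sum())
    i = rng.integers(0, len(X_min), n_new)
    j = rng.integers(0, len(X_min), n_new)
    lam = rng.random((n_new, 1))
    X_new = X_min[i] + lam * (X_min[j] - X_min[i])
    return (np.vstack([X_train, X_new]),
            np.concatenate([y_train, np.ones(n_new, dtype=int)]))

rng = np.random.default_rng(0)
skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=42)
for fold, (tr, te) in enumerate(skf.split(X, y), start=1):
    # SMOTE-style balancing is applied ONLY to the training fold
    X_bal, y_bal = oversample_minority(X[tr], y[tr], rng)
    print(f"Fold {fold}: before {y[tr].mean():.4f} -> after {y_bal.mean():.4f}")
```

Balancing inside each fold, rather than before splitting, avoids leaking synthetic points derived from test observations into training.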

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5423557	total: 74.8ms	remaining: 19.7s
...
263:	learn: 0.0045398	total: 26.8s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.42
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.71
 - F1-Score_Train: 99.71
 - Precision_Test: 19.61
 - Recall_Test: 88.10
 - AUPRC_Test: 75.09
 - Accuracy_Test: 99.37
 - F1-Score_Test: 32.08
 - max_depth: 5
 - n_estimators: 264
 - learning_rate: 0.07
 - scale_pos_weight: 7.47
✅ Tamaño del DataFrame actualizado: (2, 133)
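The `Sobreajuste: 1` flag reflects the gap between AUPRC_Train (~100%) and AUPRC_Test (~75%) seen in both folds. A minimal sketch, assuming scikit-learn, of how AUPRC and such a flag could be computed; the 10-point gap threshold is a hypothetical choice, not taken from the notebook:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import average_precision_score
from sklearn.model_selection import train_test_split

# Toy imbalanced dataset; stratified split preserves the class ratio
X, y = make_classification(n_samples=3000, weights=[0.97], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# AUPRC = area under the precision-recall curve
# (average precision in scikit-learn), scaled to percent
auprc_train = average_precision_score(y_tr, clf.predict_proba(X_tr)[:, 1]) * 100
auprc_test = average_precision_score(y_te, clf.predict_proba(X_te)[:, 1]) * 100

# Overfitting flag: train-test AUPRC gap above a hypothetical 10-point threshold
sobreajuste = int(auprc_train - auprc_test > 10)
print(f"AUPRC_Train: {auprc_train:.2f}  AUPRC_Test: {auprc_test:.2f}  "
      f"Sobreajuste: {sobreajuste}")
```

Average precision is preferred over ROC-AUC here because, with ~0.17% positives, a classifier can reach near-perfect ROC-AUC while still producing many false positives; the precision-recall curve exposes that.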

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5350895	total: 89.7ms	remaining: 23.6s
...
136:	learn: 0.0087524	total: 14.2s	remaining: 13.2s
137:	learn: 0.0086689	total: 14.3s	remaining: 13.1s
138:	learn: 0.0085935	total: 14.4s	remaining: 12.9s
139:	learn: 0.0084869	total: 14.5s	remaining: 12.8s
140:	learn: 0.0083269	total: 14.6s	remaining: 12.7s
141:	learn: 0.0082315	total: 14.7s	remaining: 12.6s
142:	learn: 0.0081617	total: 14.7s	remaining: 12.5s
143:	learn: 0.0081311	total: 14.8s	remaining: 12.3s
144:	learn: 0.0080938	total: 14.9s	remaining: 12.2s
145:	learn: 0.0080386	total: 15s	remaining: 12.1s
146:	learn: 0.0079042	total: 15.1s	remaining: 12s
147:	learn: 0.0078416	total: 15.1s	remaining: 11.9s
148:	learn: 0.0077600	total: 15.2s	remaining: 11.7s
149:	learn: 0.0077116	total: 15.3s	remaining: 11.6s
150:	learn: 0.0076449	total: 15.4s	remaining: 11.5s
151:	learn: 0.0075508	total: 15.5s	remaining: 11.4s
152:	learn: 0.0074567	total: 15.6s	remaining: 11.3s
153:	learn: 0.0073931	total: 15.8s	remaining: 11.3s
154:	learn: 0.0073355	total: 15.9s	remaining: 11.2s
155:	learn: 0.0073020	total: 16s	remaining: 11.1s
156:	learn: 0.0072397	total: 16.1s	remaining: 11s
157:	learn: 0.0071799	total: 16.3s	remaining: 10.9s
158:	learn: 0.0071124	total: 16.4s	remaining: 10.8s
159:	learn: 0.0070134	total: 16.4s	remaining: 10.7s
160:	learn: 0.0069264	total: 16.5s	remaining: 10.6s
161:	learn: 0.0068770	total: 16.6s	remaining: 10.5s
162:	learn: 0.0068475	total: 16.7s	remaining: 10.3s
163:	learn: 0.0068005	total: 16.8s	remaining: 10.2s
164:	learn: 0.0067598	total: 16.9s	remaining: 10.1s
165:	learn: 0.0067002	total: 17s	remaining: 10s
166:	learn: 0.0066404	total: 17.1s	remaining: 9.91s
167:	learn: 0.0065982	total: 17.1s	remaining: 9.79s
168:	learn: 0.0065843	total: 17.2s	remaining: 9.67s
169:	learn: 0.0065363	total: 17.3s	remaining: 9.56s
170:	learn: 0.0064352	total: 17.4s	remaining: 9.45s
171:	learn: 0.0063816	total: 17.4s	remaining: 9.33s
172:	learn: 0.0063010	total: 17.5s	remaining: 9.22s
173:	learn: 0.0062827	total: 17.6s	remaining: 9.1s
174:	learn: 0.0062023	total: 17.7s	remaining: 8.99s
175:	learn: 0.0061719	total: 17.8s	remaining: 8.89s
176:	learn: 0.0061167	total: 17.9s	remaining: 8.78s
177:	learn: 0.0060809	total: 17.9s	remaining: 8.67s
178:	learn: 0.0060303	total: 18s	remaining: 8.56s
179:	learn: 0.0059782	total: 18.1s	remaining: 8.46s
180:	learn: 0.0058688	total: 18.2s	remaining: 8.35s
181:	learn: 0.0057988	total: 18.3s	remaining: 8.25s
182:	learn: 0.0057500	total: 18.4s	remaining: 8.13s
183:	learn: 0.0057193	total: 18.4s	remaining: 8.02s
184:	learn: 0.0056424	total: 18.5s	remaining: 7.92s
185:	learn: 0.0056093	total: 18.6s	remaining: 7.81s
186:	learn: 0.0055799	total: 18.7s	remaining: 7.7s
187:	learn: 0.0055435	total: 18.8s	remaining: 7.6s
188:	learn: 0.0054829	total: 18.9s	remaining: 7.5s
189:	learn: 0.0054456	total: 19s	remaining: 7.39s
190:	learn: 0.0054224	total: 19s	remaining: 7.28s
191:	learn: 0.0053769	total: 19.1s	remaining: 7.17s
192:	learn: 0.0053520	total: 19.2s	remaining: 7.06s
193:	learn: 0.0052864	total: 19.3s	remaining: 6.97s
194:	learn: 0.0052746	total: 19.4s	remaining: 6.86s
195:	learn: 0.0052272	total: 19.5s	remaining: 6.75s
196:	learn: 0.0051698	total: 19.6s	remaining: 6.66s
197:	learn: 0.0051451	total: 19.6s	remaining: 6.55s
198:	learn: 0.0051032	total: 19.7s	remaining: 6.44s
199:	learn: 0.0050487	total: 19.8s	remaining: 6.35s
200:	learn: 0.0050144	total: 19.9s	remaining: 6.24s
201:	learn: 0.0049766	total: 20s	remaining: 6.14s
202:	learn: 0.0049236	total: 20.1s	remaining: 6.04s
203:	learn: 0.0049131	total: 20.2s	remaining: 5.93s
204:	learn: 0.0048868	total: 20.3s	remaining: 5.84s
205:	learn: 0.0048322	total: 20.5s	remaining: 5.76s
206:	learn: 0.0047909	total: 20.6s	remaining: 5.68s
207:	learn: 0.0047724	total: 20.8s	remaining: 5.59s
208:	learn: 0.0047630	total: 20.9s	remaining: 5.51s
209:	learn: 0.0047075	total: 21.1s	remaining: 5.42s
210:	learn: 0.0046857	total: 21.2s	remaining: 5.34s
211:	learn: 0.0046445	total: 21.4s	remaining: 5.25s
212:	learn: 0.0046334	total: 21.6s	remaining: 5.16s
213:	learn: 0.0046032	total: 21.7s	remaining: 5.07s
214:	learn: 0.0045589	total: 21.8s	remaining: 4.98s
215:	learn: 0.0044911	total: 22s	remaining: 4.89s
216:	learn: 0.0044466	total: 22.1s	remaining: 4.8s
217:	learn: 0.0044153	total: 22.3s	remaining: 4.7s
218:	learn: 0.0043743	total: 22.5s	remaining: 4.62s
219:	learn: 0.0043498	total: 22.6s	remaining: 4.53s
220:	learn: 0.0043299	total: 22.8s	remaining: 4.43s
221:	learn: 0.0043161	total: 23s	remaining: 4.34s
222:	learn: 0.0042936	total: 23.1s	remaining: 4.25s
223:	learn: 0.0042936	total: 23.2s	remaining: 4.15s
224:	learn: 0.0042630	total: 23.4s	remaining: 4.05s
225:	learn: 0.0042492	total: 23.5s	remaining: 3.95s
226:	learn: 0.0042394	total: 23.6s	remaining: 3.85s
227:	learn: 0.0042084	total: 23.8s	remaining: 3.75s
228:	learn: 0.0041720	total: 23.9s	remaining: 3.66s
229:	learn: 0.0041720	total: 24.1s	remaining: 3.56s
230:	learn: 0.0041489	total: 24.2s	remaining: 3.46s
231:	learn: 0.0041260	total: 24.4s	remaining: 3.36s
232:	learn: 0.0041079	total: 24.5s	remaining: 3.27s
233:	learn: 0.0040922	total: 24.7s	remaining: 3.17s
234:	learn: 0.0040527	total: 24.9s	remaining: 3.07s
235:	learn: 0.0040248	total: 25s	remaining: 2.97s
236:	learn: 0.0039973	total: 25.2s	remaining: 2.87s
237:	learn: 0.0039952	total: 25.3s	remaining: 2.77s
238:	learn: 0.0039719	total: 25.5s	remaining: 2.67s
239:	learn: 0.0039484	total: 25.7s	remaining: 2.56s
240:	learn: 0.0039077	total: 25.8s	remaining: 2.46s
241:	learn: 0.0039077	total: 25.8s	remaining: 2.35s
242:	learn: 0.0039077	total: 25.9s	remaining: 2.24s
243:	learn: 0.0039077	total: 26s	remaining: 2.13s
244:	learn: 0.0038807	total: 26s	remaining: 2.02s
245:	learn: 0.0038807	total: 26.1s	remaining: 1.91s
246:	learn: 0.0038616	total: 26.2s	remaining: 1.8s
247:	learn: 0.0038208	total: 26.3s	remaining: 1.7s
248:	learn: 0.0037994	total: 26.4s	remaining: 1.59s
249:	learn: 0.0037886	total: 26.4s	remaining: 1.48s
250:	learn: 0.0037815	total: 26.5s	remaining: 1.37s
251:	learn: 0.0037815	total: 26.6s	remaining: 1.27s
252:	learn: 0.0037595	total: 26.7s	remaining: 1.16s
253:	learn: 0.0037427	total: 26.7s	remaining: 1.05s
254:	learn: 0.0037081	total: 26.8s	remaining: 947ms
255:	learn: 0.0036948	total: 26.9s	remaining: 841ms
256:	learn: 0.0036948	total: 27s	remaining: 735ms
257:	learn: 0.0036948	total: 27s	remaining: 629ms
258:	learn: 0.0036711	total: 27.1s	remaining: 524ms
259:	learn: 0.0036711	total: 27.2s	remaining: 419ms
260:	learn: 0.0036710	total: 27.3s	remaining: 314ms
261:	learn: 0.0036710	total: 27.3s	remaining: 209ms
262:	learn: 0.0036378	total: 27.4s	remaining: 104ms
263:	learn: 0.0036377	total: 27.5s	remaining: 0us
[I 2024-12-19 14:28:30,358] Trial 18 finished with value: 75.76072978196702 and parameters: {'learning_rate': 0.06850266651634628, 'max_depth': 5, 'n_estimators': 264, 'scale_pos_weight': 7.470440912112707}. Best is trial 17 with value: 76.82995176096074.
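Each Optuna trial above samples hyperparameters (`learning_rate`, `max_depth`, `n_estimators`, `scale_pos_weight`), trains CatBoost, and is scored by AUPRC; the study keeps the best trial. As a minimal stdlib analogue of that loop (plain random search over a toy objective — the real study uses Optuna's TPE sampler and a cross-validated CatBoost score):

```python
import random

def random_search(objective, space, n_trials, seed=0):
    """Toy analogue of Optuna's study.optimize: sample hyperparameters
    uniformly from `space`, score each with `objective`, keep the best."""
    rng = random.Random(seed)
    best_params, best_value = None, float("-inf")
    for _ in range(n_trials):
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        value = objective(params)
        if value > best_value:
            best_params, best_value = params, value
    return best_params, best_value

# Toy objective peaking at learning_rate = 0.07 (the value trial 18 found above)
space = {"learning_rate": (0.01, 0.3)}
best, value = random_search(lambda p: -abs(p["learning_rate"] - 0.07), space, 200)
```

With 200 trials the best sampled `learning_rate` lands very close to the optimum, mirroring how successive Optuna trials converge on the reported parameters.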
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.54
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.77
 - F1-Score_Train: 99.77
 - Precision_Test: 21.05
 - Recall_Test: 85.71
 - AUPRC_Test: 76.72
 - Accuracy_Test: 99.44
 - F1-Score_Test: 33.80
 - max_depth: 5
 - n_estimators: 264
 - learning_rate: 0.07
 - scale_pos_weight: 7.47
 - [all other CatBoost parameters: None, i.e. library defaults]
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 75.7607
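The cross-validated score reported here is AUPRC (average precision), the metric of choice for this highly imbalanced dataset. A minimal stdlib sketch of how average precision is computed from labels and scores, using the step-wise sum AP = Σ (Rᵢ − Rᵢ₋₁) · Pᵢ that matches scikit-learn's `average_precision_score` definition (assuming no tied scores):

```python
def average_precision(y_true, scores):
    """AUPRC via the step-wise sum over positives, ranked by score."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    total_pos = sum(y_true)
    tp, ap, prev_recall = 0, 0.0, 0.0
    for rank, i in enumerate(order, start=1):
        if y_true[i] == 1:
            tp += 1
            precision = tp / rank       # precision at this cutoff
            recall = tp / total_pos     # recall at this cutoff
            ap += (recall - prev_recall) * precision
            prev_recall = recall
    return ap

# Perfect ranking of the positives gives AP = 1.0
print(average_precision([1, 1, 0, 0], [0.9, 0.8, 0.3, 0.1]))  # 1.0
```

Unlike accuracy (which is ~99% here even for trivial models), AUPRC only rewards ranking the rare fraud cases above legitimate transactions.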

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
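The proportions above show SMOTE rebalancing the training fold from ~0.17% fraud to a 50/50 split. A minimal stdlib sketch of the interpolation SMOTE performs to create each synthetic minority example (the real imbalanced-learn implementation selects among the k nearest minority neighbors; here the neighbor is hand-picked for illustration):

```python
import random

def smote_sample(x, neighbor, rng):
    """One synthetic minority example by linear interpolation,
    as in SMOTE: x_new = x + u * (neighbor - x), with u ~ U(0, 1)."""
    u = rng.random()
    return [xi + u * (ni - xi) for xi, ni in zip(x, neighbor)]

rng = random.Random(42)
synthetic = smote_sample([1.0, 2.0], [3.0, 4.0], rng)
print(synthetic)  # a point on the segment between the two minority samples
```

Crucially, as the fold-by-fold logs show, SMOTE is applied only to the training fold, never to the test fold, so the reported test metrics reflect the true class imbalance.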

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5020543	total: 84.5ms	remaining: 25.2s
...	[iterations 1-297 omitted: loss falls to ~0.0031 by iteration ~200 and plateaus thereafter]
298:	learn: 0.0030372	total: 30.9s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.34
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.67
 - F1-Score_Train: 99.67
 - Precision_Test: 16.25
 - Recall_Test: 87.30
 - AUPRC_Test: 77.75
 - Accuracy_Test: 99.22
 - F1-Score_Test: 27.40
 - max_depth: 5
 - n_estimators: 299
 - learning_rate: 0.08
 - scale_pos_weight: 11.26
 - [all other CatBoost parameters: None, i.e. library defaults]
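The summary above reports `Sobreajuste: 1` alongside a large train/test AUPRC gap (99.98 vs 77.75). A hedged sketch of such a flag — the notebook's exact rule is not shown here, so the 10-point gap threshold below is an assumption for illustration:

```python
def overfit_flag(train_auprc, test_auprc, gap_threshold=10.0):
    """Flag overfitting when the train/test AUPRC gap exceeds
    gap_threshold percentage points (assumed convention)."""
    return int(train_auprc - test_auprc > gap_threshold)

print(overfit_flag(99.98, 77.75))  # 1: near-perfect train score, much lower test score
```

A flag like this is useful when ranking dozens of model/technique combinations, since SMOTE-trained models often memorize the synthetic minority points.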
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4924829	total: 78.9ms	remaining: 23.5s
1:	learn: 0.3594689	total: 157ms	remaining: 23.2s
[... iterations 2-297 omitted: the training loss decreases steadily and plateaus near 0.0033 from roughly iteration 230 onward ...]
298:	learn: 0.0033197	total: 28.9s	remaining: 0us
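The training loss above stops improving well before iteration 298, which is exactly the situation CatBoost's `early_stopping_rounds` / `od_wait` options address. A minimal pure-Python sketch of that patience rule (the loss values and patience here are illustrative):

```python
def stop_iteration(losses, patience):
    """Return the 1-based iteration at which patience-based early stopping
    would halt (no improvement for `patience` rounds), or None if the loss
    keeps improving to the end."""
    best, best_i = float("inf"), -1
    for i, loss in enumerate(losses):
        if loss < best:
            best, best_i = loss, i
        elif i - best_i >= patience:
            return i + 1
    return None

# Toy log shaped like the one above: loss plateaus at 0.0033.
print(stop_iteration([0.49, 0.36, 0.0034, 0.0033, 0.0033, 0.0033, 0.0033], 3))  # 7
```

Stopping at the plateau would cut training time per fold without changing the fitted model.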

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.40
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.70
 - F1-Score_Train: 99.70
 - Precision_Test: 17.75
 - Recall_Test: 88.89
 - AUPRC_Test: 74.52
 - Accuracy_Test: 99.29
 - F1-Score_Test: 29.59
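The gap between near-perfect training metrics and a test precision of 17.75% at 88.89% recall is typical after SMOTE: the model catches most frauds but also flags many legitimate transactions. A minimal sketch of how such a precision/recall pair arises on an imbalanced test fold (all counts here are illustrative):

```python
import numpy as np

def precision_recall(y_true, y_pred):
    """Precision and recall for binary labels, matching the reported metrics."""
    tp = int(np.sum((y_pred == 1) & (y_true == 1)))
    fp = int(np.sum((y_pred == 1) & (y_true == 0)))
    fn = int(np.sum((y_pred == 0) & (y_true == 1)))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Hypothetical imbalanced test fold: 9 frauds among 1000 transactions.
y_true = np.array([1] * 9 + [0] * 991)
y_pred = np.zeros_like(y_true)
y_pred[:8] = 1      # catches 8 of the 9 frauds ...
y_pred[9:49] = 1    # ... but also flags 40 legitimate transactions
p, r = precision_recall(y_true, y_pred)
print(round(p * 100, 2), round(r * 100, 2))   # 16.67 88.89
```

Because only ~0.17% of transactions are fraud, even a small false-positive rate on the majority class swamps the true positives and drags precision down, while recall stays high.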
 - max_depth: 5
 - n_estimators: 299
 - learning_rate: 0.08
 - scale_pos_weight: 11.26
 - (all remaining CatBoost hyperparameters: None, i.e. library defaults)
✅ Updated results DataFrame size: (2, 133)

🔄 Fold 3: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.5013642	total: 74.1ms	remaining: 22.1s
1:	learn: 0.3453013	total: 151ms	remaining: 22.5s
[... iterations 2-297 omitted: the training loss decreases steadily and plateaus near 0.0037 from roughly iteration 190 onward ...]
298:	learn: 0.0036965	total: 27.6s	remaining: 0us
[I 2024-12-19 14:30:05,104] Trial 19 finished with value: 74.99710469798667 and parameters: {'learning_rate': 0.08233568205551695, 'max_depth': 5, 'n_estimators': 299, 'scale_pos_weight': 11.25520316622741}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.18
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.59
 - F1-Score_Train: 99.59
 - Precision_Test: 14.59
 - Recall_Test: 85.71
 - AUPRC_Test: 72.72
 - Accuracy_Test: 99.13
 - F1-Score_Test: 24.94
 - max_depth: 5
 - n_estimators: 299
 - learning_rate: 0.08
 - scale_pos_weight: 11.26
 - (all other CatBoost hyperparameters were None, i.e. library defaults; omitted for brevity)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 74.9971
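The cross-validated AUPRC average reported above can be computed by scoring each test fold with `sklearn.metrics.average_precision_score` and taking the mean. A minimal sketch on synthetic scores (not the real dataset or models):

```python
import numpy as np
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

rng = np.random.default_rng(42)
y = rng.binomial(1, 0.05, size=600)            # rare positive class, like fraud
scores = y * 0.6 + rng.normal(0.2, 0.2, 600)   # positives tend to score higher

# AUPRC per fold, then averaged — the "Promedio de AUPRC" figure
auprc_per_fold = []
skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=42)
for _, test_idx in skf.split(scores.reshape(-1, 1), y):
    auprc_per_fold.append(100 * average_precision_score(y[test_idx], scores[test_idx]))

print(f"Mean cross-validated AUPRC: {np.mean(auprc_per_fold):.4f}")
```

AUPRC is the metric of choice here because, with ~0.17% positives, accuracy and even ROC-AUC can look excellent while the model misses most fraud.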

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
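The before/after proportions above come from `value_counts(normalize=True)` on the fold's training target. As a self-contained sketch, a simple duplication-based oversampler stands in for SMOTE (imblearn's SMOTE instead interpolates new synthetic samples between minority-class neighbours):

```python
import pandas as pd

# Highly imbalanced target, like the dataset's "Class" variable (illustrative counts)
y = pd.Series([0] * 994 + [1] * 6, name="Class")
print("Before balancing:\n", y.value_counts(normalize=True))

# Simplified stand-in for SMOTE: duplicate minority rows until both classes match
minority = y[y == 1]
extra = minority.sample(994 - len(minority), replace=True, random_state=42)
y_balanced = pd.concat([y, extra], ignore_index=True)
print("After balancing:\n", y_balanced.value_counts(normalize=True))
```

Note that the resampling is applied inside each fold, on the training split only — the test split keeps its original 0.998/0.002 distribution, which is why test precision stays low even when training metrics are near perfect.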

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5230593	total: 72.7ms	remaining: 18.1s
⋯ (iterations 1–248 omitted; the training loss "learn" decreases steadily to ≈ 0.0054)
249:	learn: 0.0053749	total: 21.5s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.51
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.75
 - F1-Score_Train: 99.75
 - Precision_Test: 19.67
 - Recall_Test: 84.92
 - AUPRC_Test: 76.40
 - Accuracy_Test: 99.39
 - F1-Score_Test: 31.94
 - max_depth: 4
 - n_estimators: 250
 - learning_rate: 0.08
 - scale_pos_weight: 5.04
 - (all other CatBoost hyperparameters were None, i.e. library defaults; omitted for brevity)
✅ Tamaño del DataFrame actualizado: (1, 133)
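The `Sobreajuste: 1` flag in the summaries above is consistent with the large train/test gap (AUPRC 99.98 on train vs 76.40 on test). One plausible criterion — an assumption for illustration, not necessarily the notebook's exact rule — is to flag any model whose train–test gap exceeds a threshold:

```python
def flag_overfitting(metric_train: float, metric_test: float, threshold: float = 10.0) -> int:
    """Return 1 when the train-test gap exceeds the threshold (assumed rule)."""
    return int(metric_train - metric_test > threshold)

print(flag_overfitting(99.98, 76.40))  # gap of ~23.6 points -> prints 1
```

Flagged models can then be down-weighted when picking the final candidate, since their test-time recall/precision is what matters for fraud detection.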

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5327678	total: 109ms	remaining: 27s
⋯ (iterations 1–233 omitted; the training loss "learn" decreases steadily to ≈ 0.0085)
234:	learn: 0.0084855	total: 23.1s	remaining: 1.47s
235:	learn: 0.0084710	total: 23.2s	remaining: 1.37s
236:	learn: 0.0083971	total: 23.2s	remaining: 1.27s
237:	learn: 0.0083103	total: 23.3s	remaining: 1.18s
238:	learn: 0.0082648	total: 23.4s	remaining: 1.07s
239:	learn: 0.0081938	total: 23.5s	remaining: 977ms
240:	learn: 0.0080906	total: 23.5s	remaining: 879ms
241:	learn: 0.0080729	total: 23.6s	remaining: 781ms
242:	learn: 0.0080351	total: 23.7s	remaining: 684ms
243:	learn: 0.0079777	total: 23.8s	remaining: 585ms
244:	learn: 0.0079777	total: 23.8s	remaining: 487ms
245:	learn: 0.0079776	total: 23.9s	remaining: 389ms
246:	learn: 0.0079776	total: 24s	remaining: 291ms
247:	learn: 0.0079775	total: 24s	remaining: 194ms
248:	learn: 0.0079511	total: 24.1s	remaining: 96.7ms
249:	learn: 0.0079511	total: 24.1s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.35
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.67
 - F1-Score_Train: 99.67
 - Precision_Test: 17.88
 - Recall_Test: 89.68
 - AUPRC_Test: 77.49
 - Accuracy_Test: 99.29
 - F1-Score_Test: 29.82
 - max_depth: 4
 - n_estimators: 250
 - learning_rate: 0.08
 - scale_pos_weight: 5.04
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)
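The parameter report above comes from dumping every CatBoost constructor argument, most of which are unset (`None`, meaning library defaults). A minimal sketch of how such a dump can be reduced to the tuned values only — the dictionary below is illustrative, copied from the non-`None` entries reported above:

```python
# Illustrative parameter dict: the four tuned values from the dump above,
# plus a few of the many entries that CatBoost reports as None (defaults).
params = {
    "max_depth": 4,
    "n_estimators": 250,
    "learning_rate": 0.08,
    "scale_pos_weight": 5.04,
    "eta": None,
    "depth": None,
    "iterations": None,
}

# Keep only the parameters that were actually set (non-None).
tuned = {k: v for k, v in params.items() if v is not None}
print(tuned)
```

The same filter applied to the output of a fitted model's `get_params()` keeps future reports to a handful of meaningful lines.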

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
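The before/after proportions above come from rebalancing the fold with SMOTE. A minimal sketch of the same effect on synthetic labels — note it uses simple random duplication as a stand-in for SMOTE, which instead interpolates new synthetic minority samples rather than copying existing ones:

```python
import pandas as pd

# Synthetic imbalanced labels, mimicking the ~0.17% fraud rate shown above.
y = pd.Series([0] * 9983 + [1] * 17, name="Class")
print(y.value_counts(normalize=True))  # ~0.9983 / 0.0017, as in the log

# Simplified random oversampling (a stand-in for SMOTE): duplicate minority
# rows until both classes have the same count.
minority = y[y == 1]
extra = minority.sample(n=(y == 0).sum() - len(minority), replace=True,
                        random_state=42)
y_bal = pd.concat([y, extra], ignore_index=True)
print(y_bal.value_counts(normalize=True))  # 0.5 / 0.5, as after SMOTE
```

In the notebook itself this is done per training fold with `imblearn`'s `SMOTE.fit_resample`, never on the test fold, so the reported test metrics reflect the original imbalance.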

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[CatBoost training log condensed: iterations 0–249, learn loss 0.5261 → 0.0065, 22.2 s total.]
[I 2024-12-19 14:31:20,266] Trial 20 finished with value: 75.11058411780488 and parameters: {'learning_rate': 0.08045994957415391, 'max_depth': 4, 'n_estimators': 250, 'scale_pos_weight': 5.035013283171664}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.38
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.69
 - F1-Score_Train: 99.69
 - Precision_Test: 17.13
 - Recall_Test: 87.30
 - AUPRC_Test: 71.44
 - Accuracy_Test: 99.27
 - F1-Score_Test: 28.65
 - max_depth: 4
 - n_estimators: 250
 - learning_rate: 0.08
 - scale_pos_weight: 5.04
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 75.1106
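The reported figure is the mean AUPRC (average precision) over the cross-validation folds. A minimal sketch of that computation on synthetic imbalanced data, using `LogisticRegression` as a lightweight stand-in for the CatBoost model:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# Imbalanced synthetic data (~2% positives) as a small stand-in dataset.
X, y = make_classification(n_samples=3000, n_features=10, weights=[0.98],
                           random_state=42)

# AUPRC (average precision) per fold, then the mean, as reported above.
skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=42)
scores = []
for train_idx, test_idx in skf.split(X, y):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    proba = clf.predict_proba(X[test_idx])[:, 1]  # positive-class scores
    scores.append(average_precision_score(y[test_idx], proba))

mean_auprc = 100 * np.mean(scores)  # as a percentage, matching the log
print(f"Mean cross-validated AUPRC: {mean_auprc:.4f}")
```

Because AUPRC is computed from the ranking of positive-class probabilities on each (unbalanced) test fold, it is a far more informative metric here than accuracy, which is inflated by the majority class.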

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[CatBoost training log condensed: iterations 0–212 (log continues), learn loss 0.5159 → 0.0037.]
213:	learn: 0.0036658	total: 20.8s	remaining: 5.05s
214:	learn: 0.0036658	total: 20.8s	remaining: 4.94s
215:	learn: 0.0036658	total: 20.9s	remaining: 4.84s
216:	learn: 0.0036658	total: 21s	remaining: 4.74s
217:	learn: 0.0036658	total: 21.1s	remaining: 4.64s
218:	learn: 0.0036657	total: 21.1s	remaining: 4.53s
219:	learn: 0.0036657	total: 21.2s	remaining: 4.43s
220:	learn: 0.0036657	total: 21.3s	remaining: 4.33s
221:	learn: 0.0036657	total: 21.3s	remaining: 4.23s
222:	learn: 0.0036657	total: 21.4s	remaining: 4.13s
223:	learn: 0.0036657	total: 21.5s	remaining: 4.03s
224:	learn: 0.0036657	total: 21.6s	remaining: 3.93s
225:	learn: 0.0036657	total: 21.7s	remaining: 3.83s
226:	learn: 0.0036657	total: 21.7s	remaining: 3.73s
227:	learn: 0.0036657	total: 21.8s	remaining: 3.63s
228:	learn: 0.0036657	total: 21.9s	remaining: 3.53s
229:	learn: 0.0036657	total: 21.9s	remaining: 3.43s
230:	learn: 0.0036657	total: 22s	remaining: 3.33s
231:	learn: 0.0036657	total: 22.1s	remaining: 3.23s
232:	learn: 0.0036657	total: 22.1s	remaining: 3.13s
233:	learn: 0.0036657	total: 22.2s	remaining: 3.04s
234:	learn: 0.0036657	total: 22.3s	remaining: 2.94s
235:	learn: 0.0036657	total: 22.4s	remaining: 2.84s
236:	learn: 0.0036657	total: 22.4s	remaining: 2.74s
237:	learn: 0.0036657	total: 22.5s	remaining: 2.65s
238:	learn: 0.0036657	total: 22.6s	remaining: 2.55s
239:	learn: 0.0036657	total: 22.6s	remaining: 2.45s
240:	learn: 0.0036657	total: 22.7s	remaining: 2.36s
241:	learn: 0.0036657	total: 22.8s	remaining: 2.26s
242:	learn: 0.0036657	total: 22.9s	remaining: 2.16s
243:	learn: 0.0036657	total: 22.9s	remaining: 2.07s
244:	learn: 0.0036657	total: 23s	remaining: 1.97s
245:	learn: 0.0036657	total: 23.1s	remaining: 1.88s
246:	learn: 0.0036657	total: 23.1s	remaining: 1.78s
247:	learn: 0.0036657	total: 23.2s	remaining: 1.69s
248:	learn: 0.0036657	total: 23.3s	remaining: 1.59s
249:	learn: 0.0036657	total: 23.4s	remaining: 1.5s
250:	learn: 0.0036657	total: 23.4s	remaining: 1.4s
251:	learn: 0.0036657	total: 23.5s	remaining: 1.31s
252:	learn: 0.0036657	total: 23.6s	remaining: 1.21s
253:	learn: 0.0036657	total: 23.7s	remaining: 1.12s
254:	learn: 0.0036657	total: 23.7s	remaining: 1.02s
255:	learn: 0.0036657	total: 23.8s	remaining: 930ms
256:	learn: 0.0036657	total: 23.9s	remaining: 836ms
257:	learn: 0.0036657	total: 23.9s	remaining: 742ms
258:	learn: 0.0036657	total: 24s	remaining: 649ms
259:	learn: 0.0036657	total: 24.1s	remaining: 556ms
260:	learn: 0.0036657	total: 24.2s	remaining: 463ms
261:	learn: 0.0036657	total: 24.2s	remaining: 370ms
262:	learn: 0.0036657	total: 24.3s	remaining: 277ms
263:	learn: 0.0036657	total: 24.4s	remaining: 185ms
264:	learn: 0.0036657	total: 24.5s	remaining: 92.3ms
265:	learn: 0.0036657	total: 24.5s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.51
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.75
 - F1-Score_Train: 99.75
 - Precision_Test: 21.07
 - Recall_Test: 87.30
 - AUPRC_Test: 77.50
 - Accuracy_Test: 99.43
 - F1-Score_Test: 33.95
 - Non-default hyperparameters (all other CatBoost parameters: None):
 - max_depth: 5
 - n_estimators: 266
 - learning_rate: 0.07
 - scale_pos_weight: 7.41
✅ Updated DataFrame shape: (1, 133)
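The before/after proportions printed for each fold show the effect of applying SMOTE inside the cross-validation loop. The notebook uses imbalanced-learn's `SMOTE`; as a minimal, self-contained sketch of the interpolation idea behind it (numpy only, toy data and names are illustrative, not from the notebook):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy imbalanced data: 200 majority rows (class 0), 10 minority rows (class 1)
X_maj = rng.normal(0.0, 1.0, size=(200, 2))
X_min = rng.normal(3.0, 1.0, size=(10, 2))

def smote_like(X_min, n_new, rng):
    """Create synthetic minority rows by interpolating between a random
    minority sample and its nearest minority neighbour."""
    synth = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        dists = np.linalg.norm(X_min - X_min[i], axis=1)
        dists[i] = np.inf                  # exclude the point itself
        j = int(np.argmin(dists))
        lam = rng.random()                 # interpolation factor in [0, 1)
        synth.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.vstack(synth)

# Oversample the minority class up to the majority count (10 + 190 = 200)
X_new = smote_like(X_min, n_new=190, rng=rng)
X_bal = np.vstack([X_maj, X_min, X_new])
y_bal = np.concatenate([np.zeros(200), np.ones(200)])
print(np.bincount(y_bal.astype(int)))  # → [200 200]
```

After resampling, both classes sit at 0.5, matching the "After SMOTE" proportions printed in each fold.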

🔄 Fold 2: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
[CatBoost training log truncated: 266 iterations, learn loss 0.5321953 → 0.0040145, total time 27.1s]

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.48
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.74
 - F1-Score_Train: 99.74
 - Precision_Test: 20.37
 - Recall_Test: 88.10
 - AUPRC_Test: 76.22
 - Accuracy_Test: 99.40
 - F1-Score_Test: 33.08
 - Non-default hyperparameters (all other CatBoost parameters: None):
 - max_depth: 5
 - n_estimators: 266
 - learning_rate: 0.07
 - scale_pos_weight: 7.41
✅ Updated DataFrame shape: (2, 133)
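The Precision_Test, Recall_Test, and AUPRC_Test values reported for each fold are standard scikit-learn metrics: precision and recall come from thresholded predictions, while AUPRC (average precision) is computed from the raw probabilities. A self-contained sketch on toy labels (the numbers here are illustrative, not taken from the notebook):

```python
import numpy as np
from sklearn.metrics import precision_score, recall_score, average_precision_score

y_true = np.array([0, 0, 0, 0, 1, 1, 0, 1])
y_prob = np.array([0.1, 0.2, 0.7, 0.05, 0.9, 0.8, 0.3, 0.4])
y_pred = (y_prob >= 0.5).astype(int)   # default 0.5 decision threshold

precision = precision_score(y_true, y_pred)   # TP / (TP + FP)
recall = recall_score(y_true, y_pred)         # TP / (TP + FN)
# AUPRC uses the raw scores, not the thresholded predictions
auprc = average_precision_score(y_true, y_prob)

print(round(precision * 100, 2), round(recall * 100, 2), round(auprc * 100, 2))
# → 66.67 66.67 91.67
```

Because AUPRC sweeps over all thresholds, it can stay high even when the fixed-threshold precision collapses, which is exactly the pattern in the fold results above (Precision_Test ≈ 20% vs AUPRC_Test ≈ 76%).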

🔄 Fold 3: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5252505	total: 75.2ms	remaining: 19.9s
1:	learn: 0.3893171	total: 154ms	remaining: 20.4s
2:	learn: 0.3056325	total: 228ms	remaining: 19.9s
3:	learn: 0.2510041	total: 316ms	remaining: 20.7s
4:	learn: 0.2011433	total: 391ms	remaining: 20.4s
5:	learn: 0.1726533	total: 469ms	remaining: 20.3s
6:	learn: 0.1560461	total: 560ms	remaining: 20.7s
7:	learn: 0.1361185	total: 635ms	remaining: 20.5s
8:	learn: 0.1259254	total: 707ms	remaining: 20.2s
9:	learn: 0.1131200	total: 811ms	remaining: 20.8s
10:	learn: 0.1059297	total: 891ms	remaining: 20.6s
11:	learn: 0.0998221	total: 958ms	remaining: 20.3s
12:	learn: 0.0922692	total: 1.05s	remaining: 20.4s
13:	learn: 0.0859390	total: 1.14s	remaining: 20.4s
14:	learn: 0.0801433	total: 1.21s	remaining: 20.3s
15:	learn: 0.0732269	total: 1.36s	remaining: 21.3s
16:	learn: 0.0691247	total: 1.5s	remaining: 21.9s
17:	learn: 0.0662832	total: 1.65s	remaining: 22.7s
18:	learn: 0.0640251	total: 1.79s	remaining: 23.3s
19:	learn: 0.0616940	total: 1.94s	remaining: 23.9s
20:	learn: 0.0590924	total: 2.12s	remaining: 24.8s
21:	learn: 0.0557555	total: 2.29s	remaining: 25.4s
22:	learn: 0.0541084	total: 2.44s	remaining: 25.8s
23:	learn: 0.0530582	total: 2.6s	remaining: 26.2s
24:	learn: 0.0510674	total: 2.76s	remaining: 26.6s
25:	learn: 0.0499791	total: 2.91s	remaining: 26.9s
26:	learn: 0.0479491	total: 3.07s	remaining: 27.1s
27:	learn: 0.0468191	total: 3.21s	remaining: 27.3s
28:	learn: 0.0457355	total: 3.37s	remaining: 27.5s
29:	learn: 0.0441860	total: 3.52s	remaining: 27.7s
30:	learn: 0.0429078	total: 3.68s	remaining: 27.9s
31:	learn: 0.0412573	total: 3.84s	remaining: 28.1s
32:	learn: 0.0402831	total: 4.01s	remaining: 28.3s
33:	learn: 0.0393715	total: 4.16s	remaining: 28.4s
34:	learn: 0.0381769	total: 4.31s	remaining: 28.5s
35:	learn: 0.0372532	total: 4.49s	remaining: 28.7s
36:	learn: 0.0364303	total: 4.65s	remaining: 28.8s
37:	learn: 0.0356904	total: 4.79s	remaining: 28.7s
38:	learn: 0.0346924	total: 4.91s	remaining: 28.6s
39:	learn: 0.0336143	total: 5.08s	remaining: 28.7s
40:	learn: 0.0328505	total: 5.22s	remaining: 28.7s
41:	learn: 0.0323020	total: 5.37s	remaining: 28.7s
42:	learn: 0.0317169	total: 5.52s	remaining: 28.6s
43:	learn: 0.0307465	total: 5.69s	remaining: 28.7s
44:	learn: 0.0303286	total: 5.84s	remaining: 28.7s
45:	learn: 0.0298301	total: 6s	remaining: 28.7s
46:	learn: 0.0292503	total: 6.17s	remaining: 28.8s
47:	learn: 0.0286453	total: 6.33s	remaining: 28.8s
48:	learn: 0.0280548	total: 6.49s	remaining: 28.7s
49:	learn: 0.0274256	total: 6.65s	remaining: 28.7s
50:	learn: 0.0268870	total: 6.81s	remaining: 28.7s
51:	learn: 0.0262826	total: 6.96s	remaining: 28.7s
52:	learn: 0.0258207	total: 7.09s	remaining: 28.5s
53:	learn: 0.0252123	total: 7.2s	remaining: 28.3s
54:	learn: 0.0248204	total: 7.29s	remaining: 28s
55:	learn: 0.0244327	total: 7.37s	remaining: 27.6s
56:	learn: 0.0239599	total: 7.47s	remaining: 27.4s
57:	learn: 0.0236163	total: 7.55s	remaining: 27.1s
58:	learn: 0.0231800	total: 7.64s	remaining: 26.8s
59:	learn: 0.0228789	total: 7.74s	remaining: 26.6s
60:	learn: 0.0224941	total: 7.82s	remaining: 26.3s
61:	learn: 0.0219064	total: 7.9s	remaining: 26s
62:	learn: 0.0213902	total: 8s	remaining: 25.8s
63:	learn: 0.0212031	total: 8.08s	remaining: 25.5s
64:	learn: 0.0209521	total: 8.15s	remaining: 25.2s
65:	learn: 0.0205835	total: 8.26s	remaining: 25s
66:	learn: 0.0203131	total: 8.33s	remaining: 24.7s
67:	learn: 0.0199501	total: 8.41s	remaining: 24.5s
68:	learn: 0.0195645	total: 8.51s	remaining: 24.3s
69:	learn: 0.0192051	total: 8.59s	remaining: 24s
70:	learn: 0.0188658	total: 8.67s	remaining: 23.8s
71:	learn: 0.0187572	total: 8.76s	remaining: 23.6s
72:	learn: 0.0184655	total: 8.84s	remaining: 23.4s
73:	learn: 0.0183077	total: 8.91s	remaining: 23.1s
74:	learn: 0.0180473	total: 9.01s	remaining: 23s
75:	learn: 0.0176889	total: 9.1s	remaining: 22.7s
76:	learn: 0.0174280	total: 9.18s	remaining: 22.5s
77:	learn: 0.0170450	total: 9.28s	remaining: 22.4s
78:	learn: 0.0166358	total: 9.36s	remaining: 22.2s
79:	learn: 0.0164968	total: 9.44s	remaining: 21.9s
80:	learn: 0.0163126	total: 9.53s	remaining: 21.8s
81:	learn: 0.0160854	total: 9.61s	remaining: 21.6s
82:	learn: 0.0158992	total: 9.7s	remaining: 21.4s
83:	learn: 0.0156382	total: 9.8s	remaining: 21.2s
84:	learn: 0.0152680	total: 9.88s	remaining: 21s
85:	learn: 0.0151099	total: 9.95s	remaining: 20.8s
86:	learn: 0.0149197	total: 10.1s	remaining: 20.7s
87:	learn: 0.0146881	total: 10.1s	remaining: 20.5s
88:	learn: 0.0144293	total: 10.2s	remaining: 20.3s
⋮	(iterations 89–264 omitted; training loss plateaus at ~0.00382)
265:	learn: 0.0038196	total: 27.1s	remaining: 0us
[I 2024-12-19 14:32:46,726] Trial 21 finished with value: 76.56551175173995 and parameters: {'learning_rate': 0.07337676836051153, 'max_depth': 5, 'n_estimators': 266, 'scale_pos_weight': 7.414132310718996}. Best is trial 17 with value: 76.82995176096074.
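The Optuna trial logged above samples `learning_rate`, `max_depth`, `n_estimators`, and `scale_pos_weight` and keeps the best cross-validated AUPRC. As a minimal, self-contained sketch of that loop — with Optuna's TPE sampler replaced by plain random search and the real CatBoost-plus-SMOTE training replaced by a dummy `objective` — the search could look like this (all names and bounds here are illustrative assumptions, not the notebook's exact code):

```python
import random

# Hypothetical search space mirroring the parameters seen in the trial log.
SPACE = {
    "learning_rate": lambda: random.uniform(0.01, 0.3),
    "max_depth": lambda: random.randint(3, 10),
    "n_estimators": lambda: random.randint(100, 500),
    "scale_pos_weight": lambda: random.uniform(1.0, 10.0),
}

def objective(params):
    # Placeholder scorer: in the notebook this trains CatBoost on
    # SMOTE-balanced folds and returns the mean cross-validated AUPRC (%).
    return 100 - abs(params["learning_rate"] - 0.07) * 100

def random_search(n_trials=20, seed=42):
    random.seed(seed)
    best_value, best_params = float("-inf"), None
    for _ in range(n_trials):
        params = {name: sample() for name, sample in SPACE.items()}
        value = objective(params)
        if value > best_value:   # keep the best trial, like "Best is trial 17"
            best_value, best_params = value, params
    return best_value, best_params

best_value, best_params = random_search()
print(best_value, best_params)
```

Optuna adds pruning and smarter sampling on top of this loop, but the keep-the-best-trial structure is the same.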
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.51
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.75
 - F1-Score_Train: 99.75
 - Precision_Test: 20.45
 - Recall_Test: 86.51
 - AUPRC_Test: 75.97
 - Accuracy_Test: 99.41
 - F1-Score_Test: 33.08
 - max_depth: 5
 - n_estimators: 266
 - learning_rate: 0.07
 - scale_pos_weight: 7.41
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 76.5655
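The `Sobreajuste: 1` flag in the results above reflects the large gap between train metrics (AUPRC_Train ≈ 99.98) and test metrics (AUPRC_Test ≈ 76). A hedged sketch of such a flag — the 20-point gap threshold is an illustrative assumption, not necessarily the notebook's exact rule — can be built from `average_precision_score`:

```python
import numpy as np
from sklearn.metrics import average_precision_score

def overfit_flag(y_train, p_train, y_test, p_test, gap_threshold=20.0):
    # AUPRC on both splits, expressed in % like the log above.
    auprc_train = average_precision_score(y_train, p_train) * 100
    auprc_test = average_precision_score(y_test, p_test) * 100
    # Flag the run as overfit when the train/test gap exceeds the threshold.
    return int(auprc_train - auprc_test > gap_threshold), auprc_train, auprc_test

rng = np.random.default_rng(0)
y_tr = rng.integers(0, 2, 200)
p_tr = np.where(y_tr == 1, 0.95, 0.05)   # near-perfect train scores
y_te = rng.integers(0, 2, 200)
p_te = rng.random(200)                   # uninformative test scores
flag, a_tr, a_te = overfit_flag(y_tr, p_tr, y_te, p_te)
print(flag, round(a_tr, 2), round(a_te, 2))
```

With perfectly separated train scores and random test scores, the gap is large and the flag fires, mirroring the pattern in the trial results.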

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
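The proportions logged above (0.9983 / 0.0017 before, 0.50 / 0.50 after) come from applying SMOTE inside each fold. The notebook uses `imblearn.over_sampling.SMOTE`; as a simplified, hypothetical re-implementation of its core idea — each synthetic minority point is interpolated between a minority sample and one of its nearest minority neighbours — the balancing step can be sketched in plain NumPy:

```python
import numpy as np

def smote_like(X, y, minority=1, k=5, seed=42):
    rng = np.random.default_rng(seed)
    X_min = X[y == minority]
    n_new = (y != minority).sum() - len(X_min)   # oversample up to a 50/50 split
    # Pairwise distances among minority samples; ignore self-distance.
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    neighbours = np.argsort(d, axis=1)[:, :k]
    base = rng.integers(0, len(X_min), n_new)
    nb = neighbours[base, rng.integers(0, k, n_new)]
    gap = rng.random((n_new, 1))
    # Synthetic point = base sample + random fraction of the segment to a neighbour.
    X_new = X_min[base] + gap * (X_min[nb] - X_min[base])
    return np.vstack([X, X_new]), np.concatenate([y, np.full(n_new, minority)])

X = np.random.default_rng(0).normal(size=(1000, 4))
y = (np.arange(1000) < 10).astype(int)   # ~1% minority class, like the fraud rate
X_res, y_res = smote_like(X, y)
print(np.bincount(y_res))                # both classes now equally represented
```

The real SMOTE additionally handles edge cases (tiny minority classes, categorical variants such as SMOTENC), but the interpolation step is the same.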

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4996998	total: 78.3ms	remaining: 21.5s
1:	learn: 0.3547361	total: 160ms	remaining: 21.9s
2:	learn: 0.2689399	total: 239ms	remaining: 21.7s
⋮	(iterations 3–273 omitted; training loss plateaus at ~0.00346)
274:	learn: 0.0034573	total: 26.9s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.36
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.68
 - F1-Score_Train: 99.68
 - Precision_Test: 16.79
 - Recall_Test: 88.10
 - AUPRC_Test: 77.58
 - Accuracy_Test: 99.25
 - F1-Score_Test: 28.21
 - max_depth: 5
 - n_estimators: 275
 - learning_rate: 0.08
 - scale_pos_weight: 9.59
 - (all other CatBoost parameters: None, i.e. library defaults)
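The fold result above combines high Recall_Test (88.10) with low Precision_Test (16.79): at the default 0.5 decision threshold the model catches most fraud but raises many false alarms. An illustrative sketch (not the notebook's code) of how `precision_recall_curve` can be used to pick the threshold that maximises F1 instead — the data here is synthetic, with an assumed ~2% fraud rate:

```python
import numpy as np
from sklearn.metrics import precision_recall_curve

rng = np.random.default_rng(1)
y = (rng.random(2000) < 0.02).astype(int)                        # ~2% positives
scores = np.clip(y * 0.7 + rng.normal(0.2, 0.15, 2000), 0, 1)    # imperfect scores

prec, rec, thr = precision_recall_curve(y, scores)
f1 = 2 * prec * rec / np.maximum(prec + rec, 1e-12)
best = np.argmax(f1[:-1])   # the final (precision=1, recall=0) point has no threshold
print(f"threshold={thr[best]:.3f}  precision={prec[best]:.2f}  recall={rec[best]:.2f}")
```

Raising the threshold trades recall for precision; in a fraud setting the operating point is usually chosen from the cost of a missed fraud versus a blocked legitimate transaction, which is why the notebook also tracks the false-negative rate (FNR).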
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5099367	total: 78ms	remaining: 21.4s
⋮	(iterations 1–102 omitted; loss falls from ~0.51 to ~0.011)
103:	learn: 0.0113629	total: 13.8s	remaining: 22.7s
104:	learn: 0.0112907	total: 13.9s	remaining: 22.5s
105:	learn: 0.0110873	total: 14s	remaining: 22.3s
106:	learn: 0.0109303	total: 14.1s	remaining: 22.1s
107:	learn: 0.0107924	total: 14.1s	remaining: 21.9s
108:	learn: 0.0106152	total: 14.2s	remaining: 21.7s
109:	learn: 0.0104527	total: 14.3s	remaining: 21.5s
110:	learn: 0.0103230	total: 14.4s	remaining: 21.3s
111:	learn: 0.0101600	total: 14.5s	remaining: 21.1s
112:	learn: 0.0100058	total: 14.6s	remaining: 20.9s
113:	learn: 0.0098281	total: 14.7s	remaining: 20.7s
114:	learn: 0.0096426	total: 14.8s	remaining: 20.5s
115:	learn: 0.0094916	total: 14.8s	remaining: 20.3s
116:	learn: 0.0094177	total: 14.9s	remaining: 20.1s
117:	learn: 0.0093113	total: 15s	remaining: 20s
118:	learn: 0.0092410	total: 15.1s	remaining: 19.8s
119:	learn: 0.0091081	total: 15.2s	remaining: 19.6s
120:	learn: 0.0090282	total: 15.3s	remaining: 19.4s
121:	learn: 0.0089176	total: 15.3s	remaining: 19.2s
122:	learn: 0.0087902	total: 15.4s	remaining: 19s
123:	learn: 0.0086600	total: 15.5s	remaining: 18.9s
124:	learn: 0.0085647	total: 15.6s	remaining: 18.7s
125:	learn: 0.0085099	total: 15.7s	remaining: 18.6s
126:	learn: 0.0084077	total: 15.8s	remaining: 18.4s
127:	learn: 0.0083171	total: 15.9s	remaining: 18.2s
128:	learn: 0.0082474	total: 15.9s	remaining: 18s
129:	learn: 0.0081941	total: 16s	remaining: 17.9s
130:	learn: 0.0080728	total: 16.1s	remaining: 17.7s
131:	learn: 0.0080049	total: 16.2s	remaining: 17.5s
132:	learn: 0.0079310	total: 16.3s	remaining: 17.4s
133:	learn: 0.0078621	total: 16.4s	remaining: 17.2s
134:	learn: 0.0077760	total: 16.4s	remaining: 17s
135:	learn: 0.0076671	total: 16.5s	remaining: 16.9s
136:	learn: 0.0076015	total: 16.6s	remaining: 16.7s
137:	learn: 0.0075084	total: 16.7s	remaining: 16.6s
138:	learn: 0.0074344	total: 16.8s	remaining: 16.4s
139:	learn: 0.0073603	total: 16.9s	remaining: 16.3s
140:	learn: 0.0073139	total: 16.9s	remaining: 16.1s
141:	learn: 0.0072173	total: 17s	remaining: 16s
142:	learn: 0.0071176	total: 17.1s	remaining: 15.8s
143:	learn: 0.0070568	total: 17.2s	remaining: 15.7s
144:	learn: 0.0069763	total: 17.3s	remaining: 15.5s
145:	learn: 0.0069474	total: 17.4s	remaining: 15.3s
146:	learn: 0.0069172	total: 17.5s	remaining: 15.2s
147:	learn: 0.0067897	total: 17.5s	remaining: 15.1s
148:	learn: 0.0067044	total: 17.6s	remaining: 14.9s
149:	learn: 0.0066269	total: 17.7s	remaining: 14.8s
150:	learn: 0.0065536	total: 17.8s	remaining: 14.6s
151:	learn: 0.0065029	total: 17.9s	remaining: 14.5s
152:	learn: 0.0064233	total: 18s	remaining: 14.3s
153:	learn: 0.0063643	total: 18.1s	remaining: 14.2s
154:	learn: 0.0063215	total: 18.2s	remaining: 14.1s
155:	learn: 0.0062870	total: 18.2s	remaining: 13.9s
156:	learn: 0.0062250	total: 18.3s	remaining: 13.8s
157:	learn: 0.0061474	total: 18.4s	remaining: 13.6s
158:	learn: 0.0061162	total: 18.5s	remaining: 13.5s
159:	learn: 0.0060386	total: 18.6s	remaining: 13.4s
160:	learn: 0.0060206	total: 18.7s	remaining: 13.2s
161:	learn: 0.0059754	total: 18.8s	remaining: 13.1s
162:	learn: 0.0058923	total: 18.8s	remaining: 12.9s
163:	learn: 0.0058663	total: 18.9s	remaining: 12.8s
164:	learn: 0.0058462	total: 19s	remaining: 12.7s
165:	learn: 0.0057865	total: 19.1s	remaining: 12.5s
166:	learn: 0.0057604	total: 19.2s	remaining: 12.4s
167:	learn: 0.0057022	total: 19.2s	remaining: 12.3s
168:	learn: 0.0056484	total: 19.3s	remaining: 12.1s
169:	learn: 0.0055987	total: 19.4s	remaining: 12s
170:	learn: 0.0055405	total: 19.5s	remaining: 11.9s
171:	learn: 0.0054924	total: 19.6s	remaining: 11.7s
172:	learn: 0.0054556	total: 19.7s	remaining: 11.6s
173:	learn: 0.0054345	total: 19.7s	remaining: 11.5s
174:	learn: 0.0054256	total: 19.8s	remaining: 11.3s
175:	learn: 0.0053797	total: 19.9s	remaining: 11.2s
176:	learn: 0.0053311	total: 20s	remaining: 11.1s
177:	learn: 0.0052744	total: 20.1s	remaining: 10.9s
178:	learn: 0.0052341	total: 20.2s	remaining: 10.8s
179:	learn: 0.0051997	total: 20.2s	remaining: 10.7s
180:	learn: 0.0051599	total: 20.3s	remaining: 10.6s
181:	learn: 0.0051253	total: 20.4s	remaining: 10.4s
182:	learn: 0.0050658	total: 20.5s	remaining: 10.3s
183:	learn: 0.0050229	total: 20.6s	remaining: 10.2s
184:	learn: 0.0049556	total: 20.7s	remaining: 10.1s
185:	learn: 0.0049382	total: 20.7s	remaining: 9.93s
186:	learn: 0.0049168	total: 20.8s	remaining: 9.81s
187:	learn: 0.0049169	total: 20.9s	remaining: 9.68s
188:	learn: 0.0048876	total: 21s	remaining: 9.55s
189:	learn: 0.0048680	total: 21.1s	remaining: 9.42s
190:	learn: 0.0048140	total: 21.2s	remaining: 9.3s
191:	learn: 0.0047630	total: 21.3s	remaining: 9.19s
192:	learn: 0.0047267	total: 21.4s	remaining: 9.07s
193:	learn: 0.0046774	total: 21.4s	remaining: 8.96s
194:	learn: 0.0046636	total: 21.5s	remaining: 8.83s
195:	learn: 0.0046376	total: 21.6s	remaining: 8.71s
196:	learn: 0.0046376	total: 21.7s	remaining: 8.58s
197:	learn: 0.0046377	total: 21.7s	remaining: 8.45s
198:	learn: 0.0046376	total: 21.8s	remaining: 8.33s
199:	learn: 0.0046376	total: 21.9s	remaining: 8.22s
200:	learn: 0.0046376	total: 22s	remaining: 8.09s
201:	learn: 0.0046376	total: 22.1s	remaining: 7.97s
202:	learn: 0.0046376	total: 22.1s	remaining: 7.85s
203:	learn: 0.0045957	total: 22.2s	remaining: 7.73s
204:	learn: 0.0045461	total: 22.3s	remaining: 7.62s
205:	learn: 0.0045002	total: 22.4s	remaining: 7.5s
206:	learn: 0.0044840	total: 22.5s	remaining: 7.4s
207:	learn: 0.0044411	total: 22.7s	remaining: 7.31s
208:	learn: 0.0044411	total: 22.8s	remaining: 7.21s
209:	learn: 0.0044411	total: 22.9s	remaining: 7.1s
210:	learn: 0.0044215	total: 23.1s	remaining: 7.01s
211:	learn: 0.0043738	total: 23.2s	remaining: 6.91s
212:	learn: 0.0042997	total: 23.4s	remaining: 6.81s
213:	learn: 0.0042351	total: 23.6s	remaining: 6.72s
214:	learn: 0.0042196	total: 23.7s	remaining: 6.62s
215:	learn: 0.0042196	total: 23.8s	remaining: 6.51s
216:	learn: 0.0042195	total: 24s	remaining: 6.41s
217:	learn: 0.0042195	total: 24.1s	remaining: 6.3s
218:	learn: 0.0042195	total: 24.2s	remaining: 6.19s
219:	learn: 0.0042195	total: 24.3s	remaining: 6.09s
220:	learn: 0.0042195	total: 24.5s	remaining: 5.98s
221:	learn: 0.0042194	total: 24.6s	remaining: 5.87s
222:	learn: 0.0042194	total: 24.7s	remaining: 5.76s
223:	learn: 0.0042195	total: 24.8s	remaining: 5.65s
224:	learn: 0.0042195	total: 25s	remaining: 5.55s
225:	learn: 0.0042195	total: 25.1s	remaining: 5.44s
226:	learn: 0.0042194	total: 25.2s	remaining: 5.33s
227:	learn: 0.0042194	total: 25.3s	remaining: 5.22s
228:	learn: 0.0042194	total: 25.5s	remaining: 5.12s
229:	learn: 0.0042194	total: 25.6s	remaining: 5s
230:	learn: 0.0042194	total: 25.7s	remaining: 4.89s
231:	learn: 0.0042194	total: 25.8s	remaining: 4.78s
232:	learn: 0.0042194	total: 25.9s	remaining: 4.67s
233:	learn: 0.0042194	total: 26.1s	remaining: 4.57s
234:	learn: 0.0042194	total: 26.2s	remaining: 4.46s
235:	learn: 0.0042194	total: 26.3s	remaining: 4.35s
236:	learn: 0.0042194	total: 26.4s	remaining: 4.24s
237:	learn: 0.0042193	total: 26.6s	remaining: 4.13s
238:	learn: 0.0042193	total: 26.7s	remaining: 4.02s
239:	learn: 0.0042193	total: 26.8s	remaining: 3.91s
240:	learn: 0.0042193	total: 26.9s	remaining: 3.8s
241:	learn: 0.0042193	total: 27.1s	remaining: 3.69s
242:	learn: 0.0042193	total: 27.2s	remaining: 3.58s
243:	learn: 0.0042193	total: 27.3s	remaining: 3.47s
244:	learn: 0.0042193	total: 27.4s	remaining: 3.36s
245:	learn: 0.0042193	total: 27.6s	remaining: 3.25s
246:	learn: 0.0042193	total: 27.7s	remaining: 3.14s
247:	learn: 0.0042064	total: 27.8s	remaining: 3.03s
248:	learn: 0.0041833	total: 28s	remaining: 2.92s
249:	learn: 0.0041833	total: 28.1s	remaining: 2.81s
250:	learn: 0.0041833	total: 28.2s	remaining: 2.7s
251:	learn: 0.0041833	total: 28.3s	remaining: 2.58s
252:	learn: 0.0041832	total: 28.4s	remaining: 2.47s
253:	learn: 0.0041831	total: 28.5s	remaining: 2.35s
254:	learn: 0.0041831	total: 28.5s	remaining: 2.24s
255:	learn: 0.0041831	total: 28.6s	remaining: 2.12s
256:	learn: 0.0041831	total: 28.6s	remaining: 2.01s
257:	learn: 0.0041831	total: 28.7s	remaining: 1.89s
258:	learn: 0.0041831	total: 28.8s	remaining: 1.78s
259:	learn: 0.0041831	total: 28.8s	remaining: 1.66s
260:	learn: 0.0041831	total: 28.9s	remaining: 1.55s
261:	learn: 0.0041831	total: 29s	remaining: 1.44s
262:	learn: 0.0041831	total: 29s	remaining: 1.32s
263:	learn: 0.0041831	total: 29.1s	remaining: 1.21s
264:	learn: 0.0041639	total: 29.2s	remaining: 1.1s
265:	learn: 0.0041639	total: 29.3s	remaining: 990ms
266:	learn: 0.0041638	total: 29.3s	remaining: 879ms
267:	learn: 0.0041638	total: 29.4s	remaining: 768ms
268:	learn: 0.0041638	total: 29.5s	remaining: 657ms
269:	learn: 0.0041637	total: 29.5s	remaining: 547ms
270:	learn: 0.0041637	total: 29.6s	remaining: 437ms
271:	learn: 0.0041637	total: 29.7s	remaining: 327ms
272:	learn: 0.0041637	total: 29.7s	remaining: 218ms
273:	learn: 0.0041637	total: 29.8s	remaining: 109ms
274:	learn: 0.0041457	total: 29.9s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Técnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.34
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.67
 - F1-Score_Train: 99.67
 - Precision_Test: 17.28
 - Recall_Test: 88.89
 - AUPRC_Test: 71.87
 - Accuracy_Test: 99.27
 - F1-Score_Test: 28.94
 - max_depth: 5
 - n_estimators: 275
 - learning_rate: 0.08
 - scale_pos_weight: 9.59
 - (resto de parámetros de CatBoost: None, es decir, valores por defecto)
✅ Tamaño del DataFrame actualizado: (2, 133)
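El indicador `Sobreajuste: 1` del resumen refleja la gran brecha entre las métricas de entrenamiento (AUPRC ≈ 99.98) y de test (AUPRC ≈ 71.87). Un criterio sencillo para marcar esa brecha, con un umbral hipotético de 10 puntos (no necesariamente el usado en el notebook), podría expresarse así:

```python
# Criterio ilustrativo (umbral hipotético) para marcar sobreajuste a partir
# de la brecha entre el AUPRC de entrenamiento y el de test.
def flag_sobreajuste(auprc_train: float, auprc_test: float, umbral: float = 10.0) -> int:
    """Devuelve 1 si la brecha train-test supera el umbral, 0 en caso contrario."""
    return int(auprc_train - auprc_test > umbral)

print(flag_sobreajuste(99.98, 71.87))  # brecha de ~28 puntos → 1
```

Con una brecha tan grande, el modelo memoriza los ejemplos sintéticos de SMOTE y generaliza peor sobre fraudes reales.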

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Registro de entrenamiento abreviado: 275 iteraciones; pérdida inicial learn: 0.5083078, final: 0.0035930, tiempo total: 27.6s]
[I 2024-12-19 14:34:17,964] Trial 22 finished with value: 74.79473450830527 and parameters: {'learning_rate': 0.07738097901544849, 'max_depth': 5, 'n_estimators': 275, 'scale_pos_weight': 9.58893520201917}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Técnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.36
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.68
 - F1-Score_Train: 99.68
 - Precision_Test: 17.08
 - Recall_Test: 86.51
 - AUPRC_Test: 74.94
 - Accuracy_Test: 99.27
 - F1-Score_Test: 28.53
 - max_depth: 5
 - n_estimators: 275
 - learning_rate: 0.08
 - scale_pos_weight: 9.59
 - (resto de parámetros de CatBoost: None, es decir, valores por defecto)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 74.7947
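The cross-validated AUPRC reported above can be sketched as follows. This is an illustrative stand-in (synthetic imbalanced data, a logistic-regression model, names like `scores` are assumptions); the notebook itself computes it with CatBoost on the credit-card dataset:

```python
# Sketch: average AUPRC over stratified folds on an imbalanced problem.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# Synthetic, heavily imbalanced data (~2% positives) — illustrative only.
X, y = make_classification(n_samples=2000, weights=[0.98], random_state=0)
skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)

scores = []
for tr, te in skf.split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[tr], y[tr])
    proba = model.predict_proba(X[te])[:, 1]       # positive-class scores
    scores.append(average_precision_score(y[te], proba))  # AUPRC per fold

print(f"Mean AUPRC: {100 * np.mean(scores):.4f}")
```

`average_precision_score` is the usual scikit-learn estimator of AUPRC; multiplying by 100 matches the percent-style figures printed in this output.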

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5753285	total: 76.9ms	remaining: 18.5s
...
241:	learn: 0.0048674	total: 23s	remaining: 0us
(242 iterations; training logloss fell from 0.5753 to 0.0049 in ~23 s — intermediate lines omitted)

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.38
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.69
 - F1-Score_Train: 99.69
 - Precision_Test: 16.19
 - Recall_Test: 85.71
 - AUPRC_Test: 75.88
 - Accuracy_Test: 99.23
 - F1-Score_Test: 27.24
 - max_depth: 5
 - n_estimators: 242
 - learning_rate: 0.05
 - scale_pos_weight: 6.99
 - (all other listed hyperparameters: None — CatBoost defaults, omitted for brevity)
✅ Tamaño del DataFrame actualizado: (1, 133)
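A note on the tuned `scale_pos_weight` values above (9.59, 6.99): the common heuristic sets this parameter to the negative/positive count ratio, which for the raw class proportions printed before SMOTE would be enormous. A quick check (pure arithmetic; the proportions are the ones shown in this output):

```python
# Heuristic scale_pos_weight = n_negatives / n_positives, from the printed
# pre-SMOTE class proportions.
neg, pos = 0.99831789, 0.00168211
ratio = neg / pos
print(f"{ratio:.1f}")  # ~593.5 on the raw data
```

The much smaller Optuna-selected values (~7–10) make sense because training happens after SMOTE, where the classes are already balanced 50/50, so only a mild residual weighting is useful.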

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5800480	total: 148ms	remaining: 35.6s
...
241:	learn: 0.0072390	total: 24s	remaining: 0us
(242 iterations; training logloss fell from 0.5800 to 0.0072 in ~24 s — intermediate lines omitted)

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.15
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.57
 - F1-Score_Train: 99.57
 - Precision_Test: 14.41
 - Recall_Test: 88.89
 - AUPRC_Test: 70.79
 - Accuracy_Test: 99.09
 - F1-Score_Test: 24.81
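The `Sobreajuste: 1` flag is consistent with the large train–test gap above (AUPRC 99.97 vs 70.79). A criterion of this kind might look like the sketch below; the threshold and function name are assumptions, since the notebook's exact rule is not shown in this output:

```python
# Sketch (assumed criterion): flag overfitting when the train-test AUPRC gap
# exceeds a threshold, in percentage points.
def flag_sobreajuste(auprc_train: float, auprc_test: float, threshold: float = 10.0) -> int:
    """Return 1 when the train-test gap exceeds `threshold` points, else 0."""
    return int(auprc_train - auprc_test > threshold)

print(flag_sobreajuste(99.97, 70.79))  # gap ~29.2 points -> 1
print(flag_sobreajuste(85.00, 84.00))  # gap 1 point -> 0
```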
 - max_depth: 5
 - n_estimators: 242
 - learning_rate: 0.05
 - scale_pos_weight: 6.99
 - (all other listed hyperparameters: None — CatBoost defaults, omitted for brevity)
✅ DataFrame size updated: (2, 133)

🔄 Fold 3: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.5762286	total: 94ms	remaining: 22.6s
1:	learn: 0.4737941	total: 169ms	remaining: 20.3s
2:	learn: 0.4043772	total: 251ms	remaining: 20s
... (iterations 3-239 omitted)
240:	learn: 0.0064765	total: 25.5s	remaining: 106ms
241:	learn: 0.0064198	total: 25.6s	remaining: 0us
[I 2024-12-19 14:35:38,534] Trial 23 finished with value: 72.8818479109883 and parameters: {'learning_rate': 0.049655270980061436, 'max_depth': 5, 'n_estimators': 242, 'scale_pos_weight': 6.987733767823069}. Best is trial 17 with value: 76.82995176096074.
✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.15
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.57
 - F1-Score_Train: 99.57
 - Precision_Test: 13.96
 - Recall_Test: 86.51
 - AUPRC_Test: 71.97
 - Accuracy_Test: 99.08
 - F1-Score_Test: 24.04
 - max_depth: 5
 - n_estimators: 242
 - learning_rate: 0.05
 - scale_pos_weight: 6.99
 - (all remaining CatBoost parameters reported as None, i.e. library defaults)
✅ DataFrame size updated: (3, 133)

🏆 Average AUPRC in cross-validation: 72.8818

🔍 Optimizing hyperparameters for CatBoost with Optuna...

🔄 Fold 1: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.5074617	total: 69.9ms	remaining: 13.2s
1:	learn: 0.3757317	total: 137ms	remaining: 12.8s
2:	learn: 0.2922161	total: 207ms	remaining: 12.9s
... (iterations 3-187 omitted)
188:	learn: 0.0061620	total: 16.9s	remaining: 89.4ms
189:	learn: 0.0061169	total: 17s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.27
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.63
 - F1-Score_Train: 99.63
 - Precision_Test: 14.32
 - Recall_Test: 87.30
 - AUPRC_Test: 74.66
 - Accuracy_Test: 99.10
 - F1-Score_Test: 24.61
 - max_depth: 4
 - n_estimators: 190
 - learning_rate: 0.09
 - scale_pos_weight: 6.31
 - (all remaining CatBoost parameters reported as None, i.e. library defaults)
✅ DataFrame size updated: (1, 133)

🔄 Fold 2: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.5191676	total: 74.3ms	remaining: 14s
...	(iteraciones 1-188 omitidas)
189:	learn: 0.0095364	total: 17.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.94
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.46
 - F1-Score_Train: 99.47
 - Precision_Test: 12.15
 - Recall_Test: 89.68
 - AUPRC_Test: 75.28
 - Accuracy_Test: 98.89
 - F1-Score_Test: 21.40
 - max_depth: 4
 - n_estimators: 190
 - learning_rate: 0.09
 - scale_pos_weight: 6.31
 - (resto de hiperparámetros de CatBoost: None)
✅ Tamaño del DataFrame actualizado: (2, 133)
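Las proporciones "Antes/Después de SMOTE" que se imprimen en cada fold corresponden a `value_counts(normalize=True)` sobre la columna `Class`. El siguiente esquema mínimo reproduce ese chequeo con datos sintéticos (la proporción de fraude ~0.17 % y el sobremuestreo 1:1 son supuestos ilustrativos; SMOTE real interpola vecinos en lugar de duplicar etiquetas):

```python
import numpy as np
import pandas as pd

# Etiquetas sintéticas con ~0.17 % de fraude, como en el dataset original
rng = np.random.default_rng(42)
y = pd.Series(rng.choice([0, 1], size=100_000, p=[0.9983, 0.0017]), name="Class")

# Proporciones antes del balanceo (equivale a la salida "📊 Antes de SMOTE")
antes = y.value_counts(normalize=True)
print("📊 Antes de SMOTE:\n", antes)

# Tras llevar la clase minoritaria al tamaño de la mayoritaria (lo que hace
# SMOTE con sampling_strategy=1.0), ambas clases quedan al 50 %
faltan = int((y == 0).sum()) - int((y == 1).sum())
y_bal = pd.concat([y, pd.Series([1] * faltan, name="Class")], ignore_index=True)
despues = y_bal.value_counts(normalize=True)
print("📈 Después de SMOTE:\n", despues)
```

En el notebook este paso se hace dentro de cada fold de la validación cruzada, de modo que SMOTE solo ve los datos de entrenamiento y el fold de test conserva la distribución real.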

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5167763	total: 139ms	remaining: 26.2s
...	(iteraciones 1-188 omitidas)
189:	learn: 0.0079604	total: 17.9s	remaining: 0us
[I 2024-12-19 14:36:38,173] Trial 24 finished with value: 73.65938167950104 and parameters: {'learning_rate': 0.08600313201336994, 'max_depth': 4, 'n_estimators': 190, 'scale_pos_weight': 6.312659747077875}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.99
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 99.49
 - F1-Score_Train: 99.49
 - Precision_Test: 12.69
 - Recall_Test: 87.30
 - AUPRC_Test: 71.04
 - Accuracy_Test: 98.97
 - F1-Score_Test: 22.16
 - max_depth: 4
 - n_estimators: 190
 - learning_rate: 0.09
 - scale_pos_weight: 6.31
 - (resto de hiperparámetros de CatBoost: None)
✅ Tamaño del DataFrame actualizado: (3, 133)
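El indicador "Sobreajuste: 1" refleja la brecha entre las métricas de train y test (p. ej. AUPRC_Train ≈ 99.96 frente a AUPRC_Test ≈ 71.04). Un criterio sencillo, con un umbral ilustrativo del 10 % que no es necesariamente el del notebook original, sería:

```python
def flag_sobreajuste(auprc_train: float, auprc_test: float,
                     umbral: float = 10.0) -> int:
    """Devuelve 1 si la brecha Train-Test en AUPRC (en puntos porcentuales)
    supera el umbral; el umbral de 10 puntos es una suposición ilustrativa."""
    return int(auprc_train - auprc_test > umbral)

# Brecha de ~28.9 puntos en este trial -> modelo marcado como sobreajustado
print(flag_sobreajuste(99.96, 71.04))
```

Con SMOTE este patrón es habitual: el modelo memoriza las muestras sintéticas de la clase minoritaria y la precisión en test se desploma aunque el recall se mantenga alto.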

🏆 Promedio de AUPRC en validación cruzada: 73.6594
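El promedio de AUPRC reportado arriba se obtiene calculando la métrica en cada fold y promediando. Un esquema mínimo del patrón, con datos sintéticos y `LogisticRegression` en lugar de CatBoost por brevedad (ambos son supuestos de este ejemplo, no el código del notebook):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# Datos sintéticos desbalanceados (~1 % de positivos)
X, y = make_classification(n_samples=5_000, n_features=10,
                           weights=[0.99], random_state=42)

auprc_folds = []
cv = StratifiedKFold(n_splits=3, shuffle=True, random_state=42)
for tr_idx, te_idx in cv.split(X, y):
    model = LogisticRegression(max_iter=1_000).fit(X[tr_idx], y[tr_idx])
    proba = model.predict_proba(X[te_idx])[:, 1]
    # AUPRC (average precision) por fold: la métrica que Optuna maximiza aquí
    auprc_folds.append(average_precision_score(y[te_idx], proba))

promedio = 100 * float(np.mean(auprc_folds))
print(f"🏆 Promedio de AUPRC en validación cruzada: {promedio:.4f}")
```

En el flujo real, este promedio es el valor que devuelve la función objetivo de Optuna en cada trial, y el muestreo con SMOTE se aplica dentro de cada fold antes de entrenar.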

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5397280	total: 78.5ms	remaining: 20.6s
...	(iteraciones 1-175 omitidas)
176:	learn: 0.0053969	total: 17.6s	remaining: 8.67s
177:	learn: 0.0053404	total: 17.7s	remaining: 8.56s
178:	learn: 0.0053239	total: 17.8s	remaining: 8.45s
179:	learn: 0.0052884	total: 17.9s	remaining: 8.34s
180:	learn: 0.0052559	total: 18s	remaining: 8.23s
181:	learn: 0.0052088	total: 18.1s	remaining: 8.13s
182:	learn: 0.0051586	total: 18.1s	remaining: 8.03s
183:	learn: 0.0050976	total: 18.2s	remaining: 7.93s
184:	learn: 0.0050666	total: 18.3s	remaining: 7.83s
185:	learn: 0.0050519	total: 18.4s	remaining: 7.72s
186:	learn: 0.0050291	total: 18.5s	remaining: 7.61s
187:	learn: 0.0050018	total: 18.6s	remaining: 7.51s
188:	learn: 0.0049943	total: 18.6s	remaining: 7.39s
189:	learn: 0.0049612	total: 18.7s	remaining: 7.29s
190:	learn: 0.0049157	total: 18.8s	remaining: 7.19s
191:	learn: 0.0049058	total: 18.9s	remaining: 7.08s
192:	learn: 0.0048981	total: 18.9s	remaining: 6.97s
193:	learn: 0.0048408	total: 19s	remaining: 6.87s
194:	learn: 0.0047998	total: 19.1s	remaining: 6.76s
195:	learn: 0.0047676	total: 19.2s	remaining: 6.66s
196:	learn: 0.0047462	total: 19.3s	remaining: 6.56s
197:	learn: 0.0046707	total: 19.4s	remaining: 6.45s
198:	learn: 0.0046532	total: 19.4s	remaining: 6.35s
199:	learn: 0.0046345	total: 19.5s	remaining: 6.25s
200:	learn: 0.0045979	total: 19.6s	remaining: 6.15s
201:	learn: 0.0045979	total: 19.7s	remaining: 6.04s
202:	learn: 0.0045780	total: 19.8s	remaining: 5.94s
203:	learn: 0.0045528	total: 19.8s	remaining: 5.84s
204:	learn: 0.0045359	total: 19.9s	remaining: 5.73s
205:	learn: 0.0045018	total: 20s	remaining: 5.63s
206:	learn: 0.0044906	total: 20.1s	remaining: 5.53s
207:	learn: 0.0044790	total: 20.2s	remaining: 5.43s
208:	learn: 0.0044322	total: 20.3s	remaining: 5.33s
209:	learn: 0.0044208	total: 20.3s	remaining: 5.23s
210:	learn: 0.0043893	total: 20.4s	remaining: 5.12s
211:	learn: 0.0043567	total: 20.5s	remaining: 5.03s
212:	learn: 0.0042916	total: 20.6s	remaining: 4.93s
213:	learn: 0.0042559	total: 20.7s	remaining: 4.83s
214:	learn: 0.0042322	total: 20.7s	remaining: 4.73s
215:	learn: 0.0042142	total: 20.8s	remaining: 4.63s
216:	learn: 0.0042142	total: 20.9s	remaining: 4.52s
217:	learn: 0.0042142	total: 20.9s	remaining: 4.42s
218:	learn: 0.0041769	total: 21s	remaining: 4.32s
219:	learn: 0.0041348	total: 21.1s	remaining: 4.22s
220:	learn: 0.0041244	total: 21.2s	remaining: 4.12s
221:	learn: 0.0040966	total: 21.3s	remaining: 4.03s
222:	learn: 0.0040828	total: 21.4s	remaining: 3.93s
223:	learn: 0.0040617	total: 21.4s	remaining: 3.83s
224:	learn: 0.0040159	total: 21.5s	remaining: 3.73s
225:	learn: 0.0039882	total: 21.6s	remaining: 3.63s
226:	learn: 0.0039546	total: 21.7s	remaining: 3.53s
227:	learn: 0.0039417	total: 21.8s	remaining: 3.44s
228:	learn: 0.0039296	total: 21.8s	remaining: 3.34s
229:	learn: 0.0039023	total: 21.9s	remaining: 3.24s
230:	learn: 0.0038819	total: 22s	remaining: 3.14s
231:	learn: 0.0038683	total: 22.1s	remaining: 3.04s
232:	learn: 0.0038601	total: 22.1s	remaining: 2.94s
233:	learn: 0.0038601	total: 22.2s	remaining: 2.85s
234:	learn: 0.0038311	total: 22.3s	remaining: 2.75s
235:	learn: 0.0038136	total: 22.4s	remaining: 2.65s
236:	learn: 0.0038012	total: 22.4s	remaining: 2.56s
237:	learn: 0.0037746	total: 22.6s	remaining: 2.46s
238:	learn: 0.0037667	total: 22.6s	remaining: 2.37s
239:	learn: 0.0037666	total: 22.7s	remaining: 2.27s
240:	learn: 0.0037313	total: 22.8s	remaining: 2.17s
241:	learn: 0.0037187	total: 22.8s	remaining: 2.08s
242:	learn: 0.0037116	total: 22.9s	remaining: 1.98s
243:	learn: 0.0036899	total: 23s	remaining: 1.88s
244:	learn: 0.0036702	total: 23.1s	remaining: 1.79s
245:	learn: 0.0036363	total: 23.2s	remaining: 1.69s
246:	learn: 0.0036249	total: 23.3s	remaining: 1.6s
247:	learn: 0.0035910	total: 23.4s	remaining: 1.51s
248:	learn: 0.0035671	total: 23.6s	remaining: 1.42s
249:	learn: 0.0035475	total: 23.8s	remaining: 1.33s
250:	learn: 0.0035475	total: 23.9s	remaining: 1.24s
251:	learn: 0.0035475	total: 24s	remaining: 1.14s
252:	learn: 0.0035475	total: 24.1s	remaining: 1.05s
253:	learn: 0.0035475	total: 24.2s	remaining: 954ms
254:	learn: 0.0035475	total: 24.4s	remaining: 861ms
255:	learn: 0.0035475	total: 24.5s	remaining: 766ms
256:	learn: 0.0035475	total: 24.7s	remaining: 671ms
257:	learn: 0.0035475	total: 24.8s	remaining: 576ms
258:	learn: 0.0035475	total: 24.9s	remaining: 480ms
259:	learn: 0.0035475	total: 25s	remaining: 385ms
260:	learn: 0.0035475	total: 25.1s	remaining: 289ms
261:	learn: 0.0035475	total: 25.2s	remaining: 193ms
262:	learn: 0.0035475	total: 25.3s	remaining: 96.4ms
263:	learn: 0.0035475	total: 25.5s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.53
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.76
 - F1-Score_Train: 99.76
 - Precision_Test: 19.22
 - Recall_Test: 86.51
 - AUPRC_Test: 75.97
 - Accuracy_Test: 99.37
 - F1-Score_Test: 31.46
 - Hiperparámetros ajustados (Optuna): max_depth: 5 | n_estimators: 264 | learning_rate: 0.06 | scale_pos_weight: 7.80
 - [resto de hiperparámetros de CatBoost en None, es decir, con sus valores por defecto; se omiten por brevedad]
✅ Tamaño del DataFrame actualizado: (1, 133)
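El resumen anterior marca `Sobreajuste: 1` cuando hay una brecha grande entre las métricas de train y test (AUPRC_Train 99.98 frente a AUPRC_Test 75.97). Como esbozo ilustrativo (la función `indicador_sobreajuste` y el umbral de 10 puntos son supuestos nuestros, no el criterio exacto del cuaderno), el indicador podría calcularse así:

```python
# Esbozo hipotético del indicador de sobreajuste: compara AUPRC de train y
# test y devuelve 1 si la brecha supera un umbral en puntos porcentuales.
# El nombre de la función y el umbral son supuestos, no los del cuaderno.

def indicador_sobreajuste(auprc_train, auprc_test, umbral=10.0):
    """1 si la brecha train-test de AUPRC supera el umbral, 0 en caso contrario."""
    return int(auprc_train - auprc_test > umbral)

# Con los valores del Fold 1: brecha de 99.98 - 75.97 ≈ 24 puntos
print(indicador_sobreajuste(99.98, 75.97))  # → 1
```

Una brecha así es esperable al entrenar sobre datos balanceados con SMOTE y evaluar sobre el test original desbalanceado: el recall en test se mantiene alto (86.51) pero la precisión cae (19.22).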

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[registro de entrenamiento de CatBoost resumido por brevedad: iteraciones 0–263, pérdida learn desciende de 0.5548287 a 0.0049491, tiempo total: 24.3s]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.37
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.68
 - F1-Score_Train: 99.68
 - Precision_Test: 18.27
 - Recall_Test: 88.89
 - AUPRC_Test: 73.53
 - Accuracy_Test: 99.31
 - F1-Score_Test: 30.31
 - Hiperparámetros ajustados (Optuna): max_depth: 5 | n_estimators: 264 | learning_rate: 0.06 | scale_pos_weight: 7.80
 - [resto de hiperparámetros de CatBoost en None, es decir, con sus valores por defecto; se omiten por brevedad]
✅ Tamaño del DataFrame actualizado: (2, 133)
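El mensaje «Tamaño del DataFrame actualizado» indica que los resultados de cada fold se van acumulando fila a fila en un DataFrame de 133 columnas (métricas + hiperparámetros). Como esbozo de ese patrón de acumulación (los nombres `resultados` y `fila` son hipotéticos, y aquí se usan solo 3 columnas en lugar de 133):

```python
# Esbozo hipotético: acumular los resultados de cada fold como filas de un
# DataFrame; (1, 133) pasa a (2, 133) tras el Fold 2. Aquí, con 3 columnas.
import pandas as pd

resultados = pd.DataFrame()  # acumulador de resultados por fold
for fold in (1, 2):
    fila = {"Modelo": "CatBoost", "Tecnica": "Optuna con SMOTE", "Fold": fold}
    resultados = pd.concat([resultados, pd.DataFrame([fila])], ignore_index=True)

print(resultados.shape)  # → (2, 3)
```

`pd.concat` con `ignore_index=True` reenumera las filas, de modo que cada fold añade exactamente una fila nueva al resumen.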

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[registro de entrenamiento de CatBoost resumido por brevedad: iteraciones 0–133 mostradas, pérdida learn desciende de 0.5478250 a 0.0104519]
134:	learn: 0.0103819	total: 13s	remaining: 12.4s
135:	learn: 0.0102944	total: 13.1s	remaining: 12.3s
136:	learn: 0.0101481	total: 13.2s	remaining: 12.2s
137:	learn: 0.0100367	total: 13.3s	remaining: 12.1s
138:	learn: 0.0099386	total: 13.4s	remaining: 12s
139:	learn: 0.0098514	total: 13.5s	remaining: 12s
140:	learn: 0.0097829	total: 13.7s	remaining: 11.9s
141:	learn: 0.0096883	total: 13.8s	remaining: 11.9s
142:	learn: 0.0095963	total: 14s	remaining: 11.8s
143:	learn: 0.0094726	total: 14.2s	remaining: 11.8s
144:	learn: 0.0093919	total: 14.3s	remaining: 11.7s
145:	learn: 0.0092821	total: 14.5s	remaining: 11.7s
146:	learn: 0.0091791	total: 14.6s	remaining: 11.6s
147:	learn: 0.0090633	total: 14.8s	remaining: 11.6s
148:	learn: 0.0089473	total: 14.9s	remaining: 11.5s
149:	learn: 0.0088577	total: 15.1s	remaining: 11.5s
150:	learn: 0.0087638	total: 15.2s	remaining: 11.4s
151:	learn: 0.0086729	total: 15.4s	remaining: 11.3s
152:	learn: 0.0085945	total: 15.5s	remaining: 11.3s
153:	learn: 0.0084756	total: 15.7s	remaining: 11.2s
154:	learn: 0.0083609	total: 15.9s	remaining: 11.2s
155:	learn: 0.0082926	total: 16s	remaining: 11.1s
156:	learn: 0.0082400	total: 16.2s	remaining: 11s
157:	learn: 0.0081809	total: 16.4s	remaining: 11s
158:	learn: 0.0081253	total: 16.5s	remaining: 10.9s
159:	learn: 0.0080698	total: 16.7s	remaining: 10.8s
160:	learn: 0.0080090	total: 16.8s	remaining: 10.8s
161:	learn: 0.0079176	total: 17s	remaining: 10.7s
162:	learn: 0.0078792	total: 17.1s	remaining: 10.6s
163:	learn: 0.0078076	total: 17.3s	remaining: 10.5s
164:	learn: 0.0077535	total: 17.4s	remaining: 10.4s
165:	learn: 0.0076737	total: 17.6s	remaining: 10.4s
166:	learn: 0.0076161	total: 17.8s	remaining: 10.3s
167:	learn: 0.0075019	total: 17.9s	remaining: 10.2s
168:	learn: 0.0074154	total: 18.1s	remaining: 10.2s
169:	learn: 0.0073795	total: 18.2s	remaining: 10.1s
170:	learn: 0.0072740	total: 18.4s	remaining: 10s
171:	learn: 0.0072051	total: 18.6s	remaining: 9.93s
172:	learn: 0.0071252	total: 18.7s	remaining: 9.84s
173:	learn: 0.0070567	total: 18.9s	remaining: 9.77s
174:	learn: 0.0070403	total: 19s	remaining: 9.68s
175:	learn: 0.0069697	total: 19.1s	remaining: 9.56s
176:	learn: 0.0069294	total: 19.2s	remaining: 9.44s
177:	learn: 0.0068887	total: 19.3s	remaining: 9.32s
178:	learn: 0.0068095	total: 19.4s	remaining: 9.2s
179:	learn: 0.0067929	total: 19.5s	remaining: 9.08s
180:	learn: 0.0067114	total: 19.5s	remaining: 8.96s
181:	learn: 0.0066816	total: 19.6s	remaining: 8.85s
182:	learn: 0.0066193	total: 19.7s	remaining: 8.73s
183:	learn: 0.0065908	total: 19.8s	remaining: 8.61s
184:	learn: 0.0065465	total: 19.9s	remaining: 8.49s
185:	learn: 0.0064949	total: 20s	remaining: 8.37s
186:	learn: 0.0064334	total: 20s	remaining: 8.25s
187:	learn: 0.0063547	total: 20.1s	remaining: 8.14s
188:	learn: 0.0063043	total: 20.2s	remaining: 8.02s
189:	learn: 0.0062742	total: 20.3s	remaining: 7.9s
190:	learn: 0.0062278	total: 20.4s	remaining: 7.79s
191:	learn: 0.0062044	total: 20.5s	remaining: 7.67s
192:	learn: 0.0061440	total: 20.5s	remaining: 7.56s
193:	learn: 0.0060947	total: 20.6s	remaining: 7.45s
194:	learn: 0.0060249	total: 20.7s	remaining: 7.33s
195:	learn: 0.0059681	total: 20.8s	remaining: 7.22s
196:	learn: 0.0059357	total: 20.9s	remaining: 7.11s
197:	learn: 0.0058951	total: 21s	remaining: 6.99s
198:	learn: 0.0058714	total: 21.1s	remaining: 6.88s
199:	learn: 0.0058436	total: 21.2s	remaining: 6.77s
200:	learn: 0.0057851	total: 21.2s	remaining: 6.66s
201:	learn: 0.0057697	total: 21.3s	remaining: 6.54s
202:	learn: 0.0057231	total: 21.4s	remaining: 6.43s
203:	learn: 0.0056824	total: 21.5s	remaining: 6.32s
204:	learn: 0.0056533	total: 21.6s	remaining: 6.21s
205:	learn: 0.0056069	total: 21.7s	remaining: 6.11s
206:	learn: 0.0055726	total: 21.8s	remaining: 5.99s
207:	learn: 0.0055409	total: 21.8s	remaining: 5.88s
208:	learn: 0.0054934	total: 21.9s	remaining: 5.77s
209:	learn: 0.0054550	total: 22s	remaining: 5.66s
210:	learn: 0.0054104	total: 22.1s	remaining: 5.55s
211:	learn: 0.0053857	total: 22.2s	remaining: 5.44s
212:	learn: 0.0053411	total: 22.3s	remaining: 5.33s
213:	learn: 0.0052980	total: 22.3s	remaining: 5.22s
214:	learn: 0.0052540	total: 22.4s	remaining: 5.11s
215:	learn: 0.0052170	total: 22.5s	remaining: 5s
216:	learn: 0.0051790	total: 22.6s	remaining: 4.89s
217:	learn: 0.0051549	total: 22.7s	remaining: 4.79s
218:	learn: 0.0051415	total: 22.8s	remaining: 4.68s
219:	learn: 0.0051056	total: 22.8s	remaining: 4.57s
220:	learn: 0.0050817	total: 22.9s	remaining: 4.46s
221:	learn: 0.0050593	total: 23s	remaining: 4.35s
222:	learn: 0.0050006	total: 23.1s	remaining: 4.24s
223:	learn: 0.0049690	total: 23.2s	remaining: 4.14s
224:	learn: 0.0049181	total: 23.3s	remaining: 4.03s
225:	learn: 0.0048911	total: 23.3s	remaining: 3.92s
226:	learn: 0.0048429	total: 23.4s	remaining: 3.82s
227:	learn: 0.0048206	total: 23.5s	remaining: 3.71s
228:	learn: 0.0047707	total: 23.6s	remaining: 3.61s
229:	learn: 0.0047707	total: 23.7s	remaining: 3.5s
230:	learn: 0.0047579	total: 23.7s	remaining: 3.39s
231:	learn: 0.0047253	total: 23.8s	remaining: 3.29s
232:	learn: 0.0046932	total: 23.9s	remaining: 3.18s
233:	learn: 0.0046633	total: 24s	remaining: 3.08s
234:	learn: 0.0046276	total: 24.1s	remaining: 2.97s
235:	learn: 0.0046274	total: 24.1s	remaining: 2.87s
236:	learn: 0.0046177	total: 24.2s	remaining: 2.76s
237:	learn: 0.0046177	total: 24.3s	remaining: 2.65s
238:	learn: 0.0045803	total: 24.4s	remaining: 2.55s
239:	learn: 0.0045719	total: 24.5s	remaining: 2.44s
240:	learn: 0.0045326	total: 24.5s	remaining: 2.34s
241:	learn: 0.0044791	total: 24.6s	remaining: 2.24s
242:	learn: 0.0044427	total: 24.7s	remaining: 2.14s
243:	learn: 0.0044155	total: 24.8s	remaining: 2.03s
244:	learn: 0.0043739	total: 24.9s	remaining: 1.93s
245:	learn: 0.0043533	total: 25s	remaining: 1.83s
246:	learn: 0.0043360	total: 25s	remaining: 1.72s
247:	learn: 0.0043018	total: 25.1s	remaining: 1.62s
248:	learn: 0.0042691	total: 25.2s	remaining: 1.52s
249:	learn: 0.0042420	total: 25.3s	remaining: 1.42s
250:	learn: 0.0042061	total: 25.4s	remaining: 1.31s
251:	learn: 0.0041767	total: 25.5s	remaining: 1.21s
252:	learn: 0.0041600	total: 25.6s	remaining: 1.11s
253:	learn: 0.0041566	total: 25.6s	remaining: 1.01s
254:	learn: 0.0041320	total: 25.7s	remaining: 908ms
255:	learn: 0.0041028	total: 25.8s	remaining: 807ms
256:	learn: 0.0040704	total: 25.9s	remaining: 706ms
257:	learn: 0.0040559	total: 26s	remaining: 605ms
258:	learn: 0.0040294	total: 26.1s	remaining: 504ms
259:	learn: 0.0040075	total: 26.2s	remaining: 403ms
260:	learn: 0.0039794	total: 26.3s	remaining: 302ms
261:	learn: 0.0039707	total: 26.4s	remaining: 201ms
262:	learn: 0.0039556	total: 26.5s	remaining: 101ms
263:	learn: 0.0039526	total: 26.6s	remaining: 0us
[I 2024-12-19 14:38:03,839] Trial 25 finished with value: 74.93209955678623 and parameters: {'learning_rate': 0.06170667436486901, 'max_depth': 5, 'n_estimators': 264, 'scale_pos_weight': 7.802165848882661}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.44
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.72
 - F1-Score_Train: 99.72
 - Precision_Test: 18.89
 - Recall_Test: 86.51
 - AUPRC_Test: 75.30
 - Accuracy_Test: 99.35
 - F1-Score_Test: 31.01
 - max_depth: 5
 - n_estimators: 264
 - learning_rate: 0.06
 - scale_pos_weight: 7.80
 [... remaining CatBoost parameters were all None (library defaults) and are omitted ...]
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 74.9321
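The AUPRC values reported throughout these logs are the area under the precision-recall curve, which scikit-learn computes as average precision. A minimal sketch with toy labels and scores (the arrays are illustrative, not taken from this notebook):

```python
# Minimal sketch of the AUPRC (average precision) metric used to rank models.
# Labels and scores below are toy values, not data from this notebook.
from sklearn.metrics import average_precision_score

y_true = [0, 0, 1, 1]            # 1 = fraud (minority class)
y_score = [0.1, 0.4, 0.35, 0.8]  # predicted fraud probabilities

auprc = average_precision_score(y_true, y_score)
print(round(auprc * 100, 2))     # reported as a percentage, like the logs  → 83.33
```

Average precision sums precision at each threshold weighted by the recall gained there, so it rewards ranking fraud cases ahead of legitimate ones regardless of class imbalance.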

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
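The "Antes/Después de SMOTE" proportions above are `value_counts(normalize=True)` on the `Class` column before and after rebalancing. SMOTE itself (from `imblearn.over_sampling`) synthesizes new minority samples; as a dependency-free stand-in, the sketch below uses naive random oversampling of a toy target to show the same 0.17% → 50/50 proportion shift. All names and numbers here are illustrative.

```python
import pandas as pd

# Toy, deterministic imbalanced target: 17 fraud rows out of 10,000 (~0.17%),
# similar in spirit to the creditcard dataset's imbalance.
y = pd.Series([1] * 17 + [0] * 9_983, name="Class")
print("📊 Antes:", y.value_counts(normalize=True).to_dict())

# Naive random oversampling of the minority class up to a 50/50 split.
# SMOTE would synthesize new samples instead of duplicating existing rows.
minority = y[y == 1]
extra = minority.sample(n=(y == 0).sum() - len(minority),
                        replace=True, random_state=0)
y_bal = pd.concat([y, extra], ignore_index=True)
print("📈 Después:", y_bal.value_counts(normalize=True).to_dict())
```

Note that rebalancing is applied only to the training fold; the test fold keeps its original proportions so the metrics reflect real-world imbalance.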

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5866410	total: 189ms	remaining: 43.5s
[... iterations 1-229 omitted ...]
230:	learn: 0.0038214	total: 27.6s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.24
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.62
 - F1-Score_Train: 99.62
 - Precision_Test: 14.18
 - Recall_Test: 87.30
 - AUPRC_Test: 77.35
 - Accuracy_Test: 99.09
 - F1-Score_Test: 24.39
 - max_depth: 6
 - n_estimators: 231
 - learning_rate: 0.04
 - scale_pos_weight: 10.49
 [... remaining CatBoost parameters were all None (library defaults) and are omitted ...]
✅ Tamaño del DataFrame actualizado: (1, 133)
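The `Sobreajuste: 1` flag in the result block above coincides with a large train/test gap (AUPRC_Train 99.98 vs AUPRC_Test 77.35). A minimal sketch of one plausible rule; the 10-point gap threshold is an assumption for illustration, not the notebook's actual criterion:

```python
def overfit_flag(auprc_train: float, auprc_test: float,
                 gap_threshold: float = 10.0) -> int:
    """Hypothetical rule: flag overfitting when the train/test AUPRC gap
    (in percentage points) exceeds gap_threshold. The threshold is an
    illustrative assumption, not taken from this notebook."""
    return int(auprc_train - auprc_test > gap_threshold)

# Values from the CatBoost result above: AUPRC_Train 99.98 vs AUPRC_Test 77.35.
print(overfit_flag(99.98, 77.35))  # → 1
```

Whatever the exact rule, a model that scores near-perfectly on the (SMOTE-balanced) training data but much lower on the untouched test fold is memorizing synthetic minority samples rather than generalizing.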

🔄 Fold 2: Optimización en progreso...
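Each fold runs Optuna trials over a CatBoost search space; the parameter dicts logged per trial (`learning_rate`, `max_depth`, `n_estimators`, `scale_pos_weight`) imply ranges such as the ones below. A dependency-free sketch of the sampling step, with ranges inferred from the logged values (they are assumptions, not the notebook's actual search space):

```python
import random

def sample_catboost_params(rng: random.Random) -> dict:
    """Sample one hyperparameter set, mimicking what an Optuna trial's
    suggest_* calls might draw. Ranges are assumptions loosely inferred
    from the logged trials (e.g. max_depth 5-6, n_estimators 231-264)."""
    return {
        "learning_rate": rng.uniform(0.01, 0.3),
        "max_depth": rng.randint(4, 10),
        "n_estimators": rng.randint(100, 500),
        "scale_pos_weight": rng.uniform(1.0, 20.0),
    }

params = sample_catboost_params(random.Random(0))
print(params)
```

In the real pipeline, each sampled set trains a CatBoost model on the SMOTE-balanced fold and the mean cross-validated AUPRC is the objective value Optuna maximizes.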
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5916181	total: 91.9ms	remaining: 21.1s
[... iterations 1-116 omitted ...]
117:	learn: 0.0147099	total: 14.3s	remaining: 13.7s
118:	learn: 0.0145790	total: 14.3s	remaining: 13.5s
119:	learn: 0.0144141	total: 14.5s	remaining: 13.4s
120:	learn: 0.0142087	total: 14.5s	remaining: 13.2s
121:	learn: 0.0141146	total: 14.6s	remaining: 13.1s
122:	learn: 0.0139070	total: 14.7s	remaining: 12.9s
123:	learn: 0.0138087	total: 14.8s	remaining: 12.8s
124:	learn: 0.0136878	total: 14.9s	remaining: 12.6s
125:	learn: 0.0135440	total: 15s	remaining: 12.5s
126:	learn: 0.0133168	total: 15.1s	remaining: 12.4s
127:	learn: 0.0131474	total: 15.2s	remaining: 12.2s
128:	learn: 0.0130066	total: 15.3s	remaining: 12.1s
129:	learn: 0.0128328	total: 15.4s	remaining: 12s
130:	learn: 0.0127184	total: 15.5s	remaining: 11.8s
131:	learn: 0.0125535	total: 15.6s	remaining: 11.7s
132:	learn: 0.0123804	total: 15.7s	remaining: 11.6s
133:	learn: 0.0122736	total: 15.8s	remaining: 11.4s
134:	learn: 0.0121382	total: 15.9s	remaining: 11.3s
135:	learn: 0.0120575	total: 16s	remaining: 11.1s
136:	learn: 0.0119252	total: 16.1s	remaining: 11s
137:	learn: 0.0118272	total: 16.2s	remaining: 10.9s
138:	learn: 0.0116778	total: 16.3s	remaining: 10.8s
139:	learn: 0.0115538	total: 16.3s	remaining: 10.6s
140:	learn: 0.0114134	total: 16.5s	remaining: 10.5s
141:	learn: 0.0112919	total: 16.5s	remaining: 10.4s
142:	learn: 0.0111843	total: 16.6s	remaining: 10.2s
143:	learn: 0.0111222	total: 16.7s	remaining: 10.1s
144:	learn: 0.0110030	total: 16.8s	remaining: 9.97s
145:	learn: 0.0108826	total: 16.9s	remaining: 9.84s
146:	learn: 0.0107374	total: 17s	remaining: 9.74s
147:	learn: 0.0106469	total: 17.1s	remaining: 9.61s
148:	learn: 0.0105998	total: 17.2s	remaining: 9.48s
149:	learn: 0.0105623	total: 17.3s	remaining: 9.35s
150:	learn: 0.0104833	total: 17.4s	remaining: 9.22s
151:	learn: 0.0104204	total: 17.5s	remaining: 9.12s
152:	learn: 0.0102602	total: 17.7s	remaining: 9.01s
153:	learn: 0.0101837	total: 17.8s	remaining: 8.91s
154:	learn: 0.0100898	total: 17.9s	remaining: 8.8s
155:	learn: 0.0099529	total: 18.1s	remaining: 8.71s
156:	learn: 0.0098416	total: 18.3s	remaining: 8.62s
157:	learn: 0.0097791	total: 18.5s	remaining: 8.53s
158:	learn: 0.0097198	total: 18.6s	remaining: 8.44s
159:	learn: 0.0096618	total: 18.8s	remaining: 8.34s
160:	learn: 0.0095460	total: 19s	remaining: 8.25s
161:	learn: 0.0094568	total: 19.2s	remaining: 8.16s
162:	learn: 0.0093966	total: 19.3s	remaining: 8.06s
163:	learn: 0.0093306	total: 19.5s	remaining: 7.96s
164:	learn: 0.0092774	total: 19.7s	remaining: 7.86s
165:	learn: 0.0092010	total: 19.8s	remaining: 7.76s
166:	learn: 0.0091386	total: 20s	remaining: 7.66s
167:	learn: 0.0090547	total: 20.2s	remaining: 7.58s
168:	learn: 0.0089788	total: 20.4s	remaining: 7.48s
169:	learn: 0.0089174	total: 20.6s	remaining: 7.38s
170:	learn: 0.0088650	total: 20.7s	remaining: 7.28s
171:	learn: 0.0088410	total: 20.9s	remaining: 7.17s
172:	learn: 0.0087973	total: 21.1s	remaining: 7.06s
173:	learn: 0.0087078	total: 21.3s	remaining: 6.96s
174:	learn: 0.0086196	total: 21.4s	remaining: 6.85s
175:	learn: 0.0085350	total: 21.6s	remaining: 6.74s
176:	learn: 0.0084679	total: 21.8s	remaining: 6.64s
177:	learn: 0.0083910	total: 21.9s	remaining: 6.53s
178:	learn: 0.0082952	total: 22.1s	remaining: 6.42s
179:	learn: 0.0082027	total: 22.3s	remaining: 6.31s
180:	learn: 0.0080992	total: 22.4s	remaining: 6.2s
181:	learn: 0.0080739	total: 22.6s	remaining: 6.09s
182:	learn: 0.0080443	total: 22.8s	remaining: 5.97s
183:	learn: 0.0079837	total: 23s	remaining: 5.87s
184:	learn: 0.0079488	total: 23.1s	remaining: 5.74s
185:	learn: 0.0079110	total: 23.2s	remaining: 5.61s
186:	learn: 0.0078399	total: 23.3s	remaining: 5.48s
187:	learn: 0.0077748	total: 23.4s	remaining: 5.35s
188:	learn: 0.0077289	total: 23.5s	remaining: 5.22s
189:	learn: 0.0076759	total: 23.6s	remaining: 5.09s
190:	learn: 0.0076113	total: 23.7s	remaining: 4.96s
191:	learn: 0.0075578	total: 23.8s	remaining: 4.83s
192:	learn: 0.0074987	total: 23.8s	remaining: 4.7s
193:	learn: 0.0074558	total: 24s	remaining: 4.57s
194:	learn: 0.0073606	total: 24.1s	remaining: 4.44s
195:	learn: 0.0072791	total: 24.1s	remaining: 4.31s
196:	learn: 0.0072548	total: 24.2s	remaining: 4.18s
197:	learn: 0.0071954	total: 24.4s	remaining: 4.06s
198:	learn: 0.0071268	total: 24.4s	remaining: 3.93s
199:	learn: 0.0070635	total: 24.5s	remaining: 3.8s
200:	learn: 0.0070055	total: 24.6s	remaining: 3.68s
201:	learn: 0.0069711	total: 24.7s	remaining: 3.55s
202:	learn: 0.0069318	total: 24.8s	remaining: 3.42s
203:	learn: 0.0068948	total: 24.9s	remaining: 3.29s
204:	learn: 0.0068173	total: 25s	remaining: 3.17s
205:	learn: 0.0067627	total: 25.1s	remaining: 3.04s
206:	learn: 0.0067432	total: 25.2s	remaining: 2.92s
207:	learn: 0.0067048	total: 25.3s	remaining: 2.79s
208:	learn: 0.0066546	total: 25.4s	remaining: 2.67s
209:	learn: 0.0066100	total: 25.5s	remaining: 2.55s
210:	learn: 0.0065523	total: 25.6s	remaining: 2.42s
211:	learn: 0.0065337	total: 25.6s	remaining: 2.3s
212:	learn: 0.0064669	total: 25.7s	remaining: 2.17s
213:	learn: 0.0064255	total: 25.8s	remaining: 2.05s
214:	learn: 0.0064059	total: 25.9s	remaining: 1.93s
215:	learn: 0.0063805	total: 26s	remaining: 1.81s
216:	learn: 0.0063636	total: 26.1s	remaining: 1.68s
217:	learn: 0.0063254	total: 26.2s	remaining: 1.56s
218:	learn: 0.0062684	total: 26.3s	remaining: 1.44s
219:	learn: 0.0062415	total: 26.4s	remaining: 1.32s
220:	learn: 0.0062087	total: 26.5s	remaining: 1.2s
221:	learn: 0.0061578	total: 26.6s	remaining: 1.08s
222:	learn: 0.0060957	total: 26.6s	remaining: 956ms
223:	learn: 0.0060607	total: 26.7s	remaining: 836ms
224:	learn: 0.0060306	total: 26.8s	remaining: 716ms
225:	learn: 0.0059945	total: 26.9s	remaining: 596ms
226:	learn: 0.0059406	total: 27.1s	remaining: 477ms
227:	learn: 0.0059105	total: 27.1s	remaining: 357ms
228:	learn: 0.0058845	total: 27.2s	remaining: 238ms
229:	learn: 0.0058706	total: 27.3s	remaining: 119ms
230:	learn: 0.0058531	total: 27.4s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.97
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.48
 - F1-Score_Train: 99.48
 - Precision_Test: 12.15
 - Recall_Test: 89.68
 - AUPRC_Test: 71.65
 - Accuracy_Test: 98.89
 - F1-Score_Test: 21.40
 - Hiperparámetros optimizados por Optuna (el resto de parámetros, con valor None, se omite):
 - max_depth: 6
 - n_estimators: 231
 - learning_rate: 0.04
 - scale_pos_weight: 10.49
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
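Los bloques de proporciones «Antes/Después de SMOTE» mostrados arriba corresponden a `value_counts(normalize=True)` de pandas sobre la columna `Class`. Un esbozo mínimo con etiquetas sintéticas (el nombre `y_train` es ilustrativo; asume pandas ≥ 2.0, donde la serie resultante se llama `proportion`):

```python
import pandas as pd

# Etiquetas sintéticas con ~0.3 % de fraude (ilustrativo; el notebook real usa
# la columna `Class` del split de entrenamiento de cada fold).
y_train = pd.Series([0] * 997 + [1] * 3, name="Class")

# Proporción de cada clase; con pandas >= 2.0 la serie se llama "proportion".
proporciones = y_train.value_counts(normalize=True)
print(proporciones)
```

Tras aplicar SMOTE al split de entrenamiento, esta misma llamada sobre las etiquetas re-muestreadas devuelve 0.50/0.50 para cada clase, como en la salida anterior.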

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5962008	total: 80.9ms	remaining: 18.6s
[... iteraciones 1–229 omitidas por brevedad ...]
230:	learn: 0.0049039	total: 26.8s	remaining: 0us
[I 2024-12-19 14:39:33,117] Trial 26 finished with value: 74.38254646492007 and parameters: {'learning_rate': 0.038345268912684985, 'max_depth': 6, 'n_estimators': 231, 'scale_pos_weight': 10.485025796706932}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.99
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.49
 - F1-Score_Train: 99.49
 - Precision_Test: 12.03
 - Recall_Test: 88.10
 - AUPRC_Test: 74.14
 - Accuracy_Test: 98.90
 - F1-Score_Test: 21.16
 - Hiperparámetros optimizados por Optuna (el resto de parámetros, con valor None, se omite):
 - max_depth: 6
 - n_estimators: 231
 - learning_rate: 0.04
 - scale_pos_weight: 10.49
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 74.3825
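Las métricas AUPRC_Train ≈ 99.97 frente a AUPRC_Test ≈ 71–74 explican el indicador «Sobreajuste: 1» de los informes anteriores. Un esbozo mínimo de cómo podría calcularse el AUPRC (average precision) y marcarse el sobreajuste por la brecha train-test, asumiendo scikit-learn; los nombres `auprc_pct` y `marcar_sobreajuste` y el umbral de 10 puntos son ilustrativos, no los del notebook:

```python
from sklearn.metrics import average_precision_score

def auprc_pct(y_true, y_scores):
    """AUPRC (average precision) en escala 0-100, como en el informe."""
    return 100.0 * average_precision_score(y_true, y_scores)

def marcar_sobreajuste(auprc_train, auprc_test, brecha_pp=10.0):
    """Devuelve 1 si la brecha train-test de AUPRC supera `brecha_pp` puntos."""
    return int(auprc_train - auprc_test > brecha_pp)

# Ejemplo de juguete: 4 observaciones, 2 positivas.
ap = auprc_pct([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])
print(round(ap, 2))                      # 83.33
print(marcar_sobreajuste(99.97, 74.14))  # 1
```

Con datos tan desbalanceados, el AUPRC es más informativo que la exactitud: un modelo trivial que prediga siempre «no fraude» alcanza ~99.8 % de accuracy pero un AUPRC cercano a la proporción de positivos.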

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5137427	total: 81.7ms	remaining: 10.2s
[... iteraciones 1–124 omitidas por brevedad ...]
125:	learn: 0.0055378	total: 13.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.58
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 99.28
 - F1-Score_Train: 99.28
 - Precision_Test: 8.81
 - Recall_Test: 89.68
 - AUPRC_Test: 71.45
 - Accuracy_Test: 98.42
 - F1-Score_Test: 16.04
 - max_depth: 5
 - n_estimators: 126
 - learning_rate: 0.08
 - scale_pos_weight: 12.21
 - (all other CatBoost hyperparameters were None, i.e. left at their library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5056830	total: 77ms	remaining: 9.62s
...	(iterations 1-124 omitted; training loss decreases monotonically)
125:	learn: 0.0079320	total: 13.3s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.11
 - Recall_Train: 100.00
 - AUPRC_Train: 99.95
 - Accuracy_Train: 99.03
 - F1-Score_Train: 99.04
 - Precision_Test: 7.14
 - Recall_Test: 92.06
 - AUPRC_Test: 65.74
 - Accuracy_Test: 97.97
 - F1-Score_Test: 13.26
 - max_depth: 5
 - n_estimators: 126
 - learning_rate: 0.08
 - scale_pos_weight: 12.21
 - (all other CatBoost hyperparameters were None, i.e. left at their library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)
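The "Antes de SMOTE / Después de SMOTE" proportions printed above (≈0.9983/0.0017 rebalanced to 0.50/0.50) come from SMOTE oversampling of the minority class. As a minimal sketch of the idea, the hand-rolled function below interpolates between each minority sample and one of its nearest neighbors until both classes have equal counts. This is an illustrative simplification, not the `imblearn.over_sampling.SMOTE` implementation the notebook uses.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def smote_simplified(X, y, k=5, seed=0):
    """Naive SMOTE sketch: synthesize minority points by interpolating
    between a minority sample and one of its k nearest minority neighbors."""
    rng = np.random.default_rng(seed)
    X_min = X[y == 1]
    n_needed = (y == 0).sum() - (y == 1).sum()  # samples to reach 50/50
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    _, idx = nn.kneighbors(X_min)  # idx[i][0] is the point itself
    new_points = []
    for _ in range(n_needed):
        i = rng.integers(len(X_min))
        j = idx[i][rng.integers(1, k + 1)]  # a neighbor other than the point itself
        lam = rng.random()
        new_points.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    X_bal = np.vstack([X, np.array(new_points)])
    y_bal = np.concatenate([y, np.ones(n_needed, dtype=int)])
    return X_bal, y_bal

# Toy data with an imbalance in the spirit of the dataset (~1% fraud)
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 5))
y = np.zeros(2000, dtype=int)
y[:20] = 1  # minority class
X_bal, y_bal = smote_simplified(X, y)
print((y_bal == 1).mean())  # exactly 0.5 after rebalancing
```

Note that in the notebook this resampling is applied inside each cross-validation fold, on the training split only, so the validation split keeps the real class ratio.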

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5124124	total: 90ms	remaining: 11.3s
...	(iterations 1-124 omitted; training loss decreases monotonically)
125:	learn: 0.0067818	total: 13.3s	remaining: 0us
[I 2024-12-19 14:40:19,248] Trial 27 finished with value: 69.34335533891107 and parameters: {'learning_rate': 0.07542042576277558, 'max_depth': 5, 'n_estimators': 126, 'scale_pos_weight': 12.209948677934136}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.19
 - Recall_Train: 100.00
 - AUPRC_Train: 99.96
 - Accuracy_Train: 99.08
 - F1-Score_Train: 99.09
 - Precision_Test: 7.39
 - Recall_Test: 88.89
 - AUPRC_Test: 70.84
 - Accuracy_Test: 98.11
 - F1-Score_Test: 13.64
 - max_depth: 5
 - n_estimators: 126
 - learning_rate: 0.08
 - scale_pos_weight: 12.21
 - (all other CatBoost hyperparameters were None, i.e. left at their library defaults)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 69.3434
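The "Promedio de AUPRC en validación cruzada" reported above is the mean average precision over the stratified folds, with resampling applied to the training fold only. The sketch below reproduces that evaluation loop under stand-in assumptions: `LogisticRegression` replaces CatBoost, naive minority duplication replaces SMOTE, and a fixed parameter set replaces Optuna's search; only the fold/score structure mirrors the notebook.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# Imbalanced toy data (~1% positives), echoing the dataset's class ratio
X, y = make_classification(n_samples=3000, weights=[0.99], random_state=0)

auprcs = []
for tr, va in StratifiedKFold(n_splits=3, shuffle=True, random_state=0).split(X, y):
    # Oversample the minority class on the training fold only (naive duplication
    # here; the notebook uses SMOTE), so the validation fold keeps the real ratio.
    min_idx = tr[y[tr] == 1]
    extra = np.random.default_rng(0).choice(
        min_idx, size=(y[tr] == 0).sum() - len(min_idx)
    )
    tr_bal = np.concatenate([tr, extra])
    clf = LogisticRegression(max_iter=1000).fit(X[tr_bal], y[tr_bal])
    # AUPRC == average precision, scored on the untouched validation fold
    auprcs.append(average_precision_score(y[va], clf.predict_proba(X[va])[:, 1]))

print(f"Mean AUPRC across folds: {np.mean(auprcs):.4f}")
```

In an Optuna study, this mean fold AUPRC would be the objective value returned per trial, which is what the `Trial ... finished with value: ...` log lines above report.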

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6524236	total: 69.5ms	remaining: 13.7s
...	(iterations 1-163 omitted; training loss decreases monotonically)
164:	learn: 0.0347491	total: 15.3s	remaining: 3.07s
165:	learn: 0.0346110	total: 15.4s	remaining: 2.97s
166:	learn: 0.0344474	total: 15.5s	remaining: 2.88s
167:	learn: 0.0342581	total: 15.6s	remaining: 2.78s
168:	learn: 0.0341439	total: 15.7s	remaining: 2.69s
169:	learn: 0.0339498	total: 15.7s	remaining: 2.59s
170:	learn: 0.0337980	total: 15.8s	remaining: 2.5s
171:	learn: 0.0336242	total: 15.9s	remaining: 2.4s
172:	learn: 0.0334519	total: 16s	remaining: 2.31s
173:	learn: 0.0333075	total: 16.1s	remaining: 2.21s
174:	learn: 0.0331664	total: 16.1s	remaining: 2.12s
175:	learn: 0.0330022	total: 16.2s	remaining: 2.03s
176:	learn: 0.0329292	total: 16.3s	remaining: 1.93s
177:	learn: 0.0328002	total: 16.4s	remaining: 1.84s
178:	learn: 0.0326378	total: 16.5s	remaining: 1.75s
179:	learn: 0.0324871	total: 16.5s	remaining: 1.65s
180:	learn: 0.0323039	total: 16.6s	remaining: 1.56s
181:	learn: 0.0321592	total: 16.7s	remaining: 1.47s
182:	learn: 0.0320208	total: 16.7s	remaining: 1.37s
183:	learn: 0.0318807	total: 16.8s	remaining: 1.28s
184:	learn: 0.0317149	total: 16.9s	remaining: 1.19s
185:	learn: 0.0315804	total: 17s	remaining: 1.1s
186:	learn: 0.0314201	total: 17.1s	remaining: 1s
187:	learn: 0.0312820	total: 17.2s	remaining: 913ms
188:	learn: 0.0311894	total: 17.2s	remaining: 821ms
189:	learn: 0.0310650	total: 17.3s	remaining: 729ms
190:	learn: 0.0309496	total: 17.4s	remaining: 637ms
191:	learn: 0.0308224	total: 17.5s	remaining: 546ms
192:	learn: 0.0307135	total: 17.6s	remaining: 455ms
193:	learn: 0.0305661	total: 17.6s	remaining: 364ms
194:	learn: 0.0304228	total: 17.7s	remaining: 272ms
195:	learn: 0.0303006	total: 17.8s	remaining: 182ms
196:	learn: 0.0301610	total: 17.9s	remaining: 90.7ms
197:	learn: 0.0300388	total: 17.9s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 94.20
 - Recall_Train: 100.00
 - AUPRC_Train: 99.84
 - Accuracy_Train: 96.92
 - F1-Score_Train: 97.01
 - Precision_Test: 2.36
 - Recall_Test: 91.27
 - AUPRC_Test: 66.27
 - Accuracy_Test: 93.64
 - F1-Score_Test: 4.61
 - max_depth: 4
 - n_estimators: 198
 - learning_rate: 0.02
 - scale_pos_weight: 9.19
 - (all other CatBoost parameters: None / library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)
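The proportions printed before and after SMOTE (≈0.17% fraud → 50/50) come from oversampling the minority class inside each training fold. A very simplified, self-contained sketch of the idea — synthetic minority samples created by interpolating between existing ones (the real `imblearn` SMOTE interpolates toward k-nearest neighbours; the toy data below is hypothetical):

```python
import random

def naive_smote(minority, n_new, seed=0):
    """Simplified SMOTE sketch: synthesize minority samples by linear
    interpolation between two randomly chosen existing minority samples.
    (The real imblearn implementation uses k-nearest neighbours.)"""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a, b = rng.sample(minority, 2)
        t = rng.random()
        synthetic.append(tuple(ai + t * (bi - ai) for ai, bi in zip(a, b)))
    return synthetic

# Toy data mimicking a heavily imbalanced training fold
majority = [(0.0, 0.0)] * 594
minority = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1)]
total = len(majority) + len(minority)
print(f"before: class 1 = {len(minority) / total:.4f}")

# Oversample the minority until both classes are 50/50
balanced_minority = minority + naive_smote(minority, len(majority) - len(minority))
print(f"after:  class 1 = "
      f"{len(balanced_minority) / (len(majority) + len(balanced_minority)):.4f}")
```

Crucially, as in the notebook, resampling is applied only to the training split of each fold — the test split keeps the original 0.17% fraud rate, which is why test metrics differ so sharply from training metrics.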

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6546882	total: 69.5ms	remaining: 13.7s
[... iterations 1–196 omitted ...]
197:	learn: 0.0453078	total: 19.3s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 92.23
 - Recall_Train: 100.00
 - AUPRC_Train: 99.75
 - Accuracy_Train: 95.79
 - F1-Score_Train: 95.96
 - Precision_Test: 1.92
 - Recall_Test: 97.62
 - AUPRC_Test: 64.85
 - Accuracy_Test: 91.61
 - F1-Score_Test: 3.77
 - max_depth: 4
 - n_estimators: 198
 - learning_rate: 0.02
 - scale_pos_weight: 9.19
 - (all other CatBoost parameters: None / library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)
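The fold-2 pattern — Recall_Test ≈ 97.6% but Precision_Test ≈ 1.9% — is a direct consequence of the class imbalance: catching nearly all frauds while flagging a few thousand legitimate transactions yields a tiny precision. A minimal check computing the metrics from confusion-matrix counts (the counts below are hypothetical, chosen only to be roughly consistent with the fold-2 test numbers above):

```python
def metrics(tp, fp, fn):
    """Precision, recall and F1 from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical illustration: 123 of 126 frauds caught (recall ~97.6%),
# but ~6,270 false alarms among the legitimate transactions.
tp, fn, fp = 123, 3, 6270

p, r, f1 = metrics(tp, fp, fn)
print(f"precision={p:.4f} recall={r:.4f} f1={f1:.4f}")
```

This is why the notebook tracks AUPRC and the false-negative rate rather than accuracy: with 99.8% legitimate transactions, even a model with thousands of false positives can report >91% accuracy.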

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6536731	total: 89.8ms	remaining: 17.7s
[... iterations 1–196 omitted ...]
197:	learn: 0.0374738	total: 18s	remaining: 0us
[I 2024-12-19 14:41:21,753] Trial 28 finished with value: 65.1714938148073 and parameters: {'learning_rate': 0.016026776755532404, 'max_depth': 4, 'n_estimators': 198, 'scale_pos_weight': 9.19373718334758}. Best is trial 17 with value: 76.82995176096074.
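The `Trial 28 finished ... Best is trial 17` line is emitted by Optuna's study loop: each trial samples hyperparameters from the search space, evaluates an objective (here, a cross-validated AUPRC), and the study keeps the best trial. Optuna itself is not reproduced here; a minimal random-search stand-in over the same four parameters, with a deterministic dummy score in place of the real cross-validated objective (the `objective` function below is purely illustrative):

```python
import random

def objective(params):
    """Stand-in for the real objective (mean AUPRC over CV folds).
    A deterministic dummy score so the loop is runnable as-is."""
    return 100 - abs(params["learning_rate"] - 0.05) * 500 \
               - abs(params["max_depth"] - 6)

rng = random.Random(42)
best_score, best_params = float("-inf"), None
for trial in range(30):
    # Sample from the same search space as the Optuna study above
    params = {
        "learning_rate":    rng.uniform(0.01, 0.3),
        "max_depth":        rng.randint(3, 10),
        "n_estimators":     rng.randint(100, 500),
        "scale_pos_weight": rng.uniform(1.0, 10.0),
    }
    score = objective(params)
    if score > best_score:
        best_score, best_params = score, params

print(f"best score: {best_score:.2f} with {best_params}")
```

Optuna's TPE sampler improves on this by modelling which regions of the space produced good scores, but the trial/report/best-so-far structure is the same.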
✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 93.63
 - Recall_Train: 99.92
 - AUPRC_Train: 99.74
 - Accuracy_Train: 96.56
 - F1-Score_Train: 96.67
 - Precision_Test: 2.17
 - Recall_Test: 91.27
 - AUPRC_Test: 64.40
 - Accuracy_Test: 93.07
 - F1-Score_Test: 4.24
 - Tuned hyperparameters: learning_rate: 0.02, max_depth: 4, n_estimators: 198, scale_pos_weight: 9.19
 - [remaining CatBoost parameters omitted: all None, i.e. library defaults]
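The Train/Test metrics reported in these result blocks can be computed with scikit-learn; a minimal sketch with synthetic stand-ins (`y_true`, `scores` are illustrative, not the notebook's data) shows which metrics use hard predictions and which use raw probabilities:

```python
# Minimal sketch of how the reported metrics are computed.
# y_true/scores are synthetic stand-ins for the held-out fold labels
# and the model's fraud probabilities.
import numpy as np
from sklearn.metrics import (accuracy_score, average_precision_score,
                             f1_score, precision_score, recall_score)

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=1000)                      # illustrative labels
scores = np.clip(y_true * 0.6 + rng.normal(0.2, 0.2, 1000), 0, 1)
y_pred = (scores >= 0.5).astype(int)                        # hard predictions

print(f"Precision: {precision_score(y_true, y_pred) * 100:.2f}")
print(f"Recall:    {recall_score(y_true, y_pred) * 100:.2f}")
# AUPRC (area under the precision-recall curve) needs the raw scores,
# not the thresholded predictions:
print(f"AUPRC:     {average_precision_score(y_true, scores) * 100:.2f}")
print(f"Accuracy:  {accuracy_score(y_true, y_pred) * 100:.2f}")
print(f"F1-Score:  {f1_score(y_true, y_pred) * 100:.2f}")
```

Note how a model can score high Recall_Test but very low Precision_Test, as in the block above: with a 0.17% fraud rate, even a small false-positive rate on the majority class swamps the true positives.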
✅ Updated DataFrame shape: (3, 133)

🏆 Mean cross-validated AUPRC: 65.1715

🔍 Optimizing hyperparameters for CatBoost with Optuna...

🔄 Fold 1: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4935286	total: 66.3ms	remaining: 10.1s
[... CatBoost iteration log truncated (iterations 1-151) ...]
152:	learn: 0.0145392	total: 12.9s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 98.17
 - Recall_Train: 100.00
 - AUPRC_Train: 99.93
 - Accuracy_Train: 99.07
 - F1-Score_Train: 99.08
 - Precision_Test: 7.09
 - Recall_Test: 89.68
 - AUPRC_Test: 66.93
 - Accuracy_Test: 98.01
 - F1-Score_Test: 13.15
 - Tuned hyperparameters: learning_rate: 0.10, max_depth: 3, n_estimators: 153, scale_pos_weight: 5.70
 - [remaining CatBoost parameters omitted: all None, i.e. library defaults]
✅ Updated DataFrame shape: (1, 133)

🔄 Fold 2: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.5084963	total: 65.7ms	remaining: 9.98s
[... CatBoost iteration log truncated (iterations 1-151) ...]
152:	learn: 0.0222708	total: 12.6s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Modelo: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 97.46
 - Recall_Train: 99.98
 - AUPRC_Train: 99.95
 - Accuracy_Train: 98.69
 - F1-Score_Train: 98.70
 - Precision_Test: 5.54
 - Recall_Test: 92.06
 - AUPRC_Test: 74.19
 - Accuracy_Test: 97.35
 - F1-Score_Test: 10.46
 - Tuned hyperparameters: learning_rate: 0.10, max_depth: 3, n_estimators: 153, scale_pos_weight: 5.70
 - [remaining CatBoost parameters omitted: all None, i.e. library defaults]
✅ Updated DataFrame shape: (2, 133)

🔄 Fold 3: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.5026578	total: 63.3ms	remaining: 9.62s
[... CatBoost iteration log truncated (iterations 1-151) ...]
152:	learn: 0.0186160	total: 12.7s	remaining: 0us
[I 2024-12-19 14:42:06,211] Trial 29 finished with value: 71.18715885589056 and parameters: {'learning_rate': 0.09688742604917952, 'max_depth': 3, 'n_estimators': 153, 'scale_pos_weight': 5.703085534477173}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 97.66
 - Recall_Train: 100.00
 - AUPRC_Train: 99.94
 - Accuracy_Train: 98.80
 - F1-Score_Train: 98.82
 - Precision_Test: 5.71
 - Recall_Test: 88.89
 - AUPRC_Test: 72.44
 - Accuracy_Test: 97.51
 - F1-Score_Test: 10.73
 - max_depth: 3
 - n_estimators: 153
 - learning_rate: 0.10
 - scale_pos_weight: 5.70
 [... remaining CatBoost hyperparameters were all reported as None (library defaults); output condensed ...]
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 71.1872
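The cross-validated AUPRC average reported above comes from scoring each fold with average precision. A minimal sketch of that per-fold computation with scikit-learn, using illustrative data:

```python
# Per-fold AUPRC as reported in the log, computed with scikit-learn.
# y_true / y_scores are illustrative stand-ins for a fold's labels and
# the model's predicted probabilities for the positive (fraud) class.
import numpy as np
from sklearn.metrics import average_precision_score

y_true = np.array([0, 0, 1, 1])
y_scores = np.array([0.1, 0.4, 0.35, 0.8])

auprc = average_precision_score(y_true, y_scores) * 100  # as a percentage
print(f"AUPRC: {auprc:.4f}")  # → AUPRC: 83.3333
```

The fold values are then averaged to produce the "Promedio de AUPRC en validación cruzada" figure.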

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6192777	total: 86ms	remaining: 24.4s
[... per-iteration CatBoost training log condensed: iterations 1–283 omitted ...]
284:	learn: 0.0050951	total: 32.6s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.50
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.75
 - F1-Score_Train: 99.75
 - Precision_Test: 19.33
 - Recall_Test: 87.30
 - AUPRC_Test: 77.71
 - Accuracy_Test: 99.37
 - F1-Score_Test: 31.65
 - max_depth: 6
 - n_estimators: 285
 - learning_rate: 0.03
 - scale_pos_weight: 5.75
 [... remaining CatBoost hyperparameters were all reported as None (library defaults); output condensed ...]
✅ Tamaño del DataFrame actualizado: (1, 133)
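The result block above reports "Sobreajuste: 1" alongside a large train/test gap (e.g. Precision 99.50 in training vs 19.33 in test). The notebook's exact rule for that flag is not shown in this output; the helper below and its 10-point threshold are assumptions for illustration only.

```python
# Hypothetical reconstruction of an overfitting flag like "Sobreajuste: 1".
# The real criterion used in the notebook may differ.
def overfit_flag(metric_train: float, metric_test: float, threshold: float = 10.0) -> int:
    """Return 1 when the train metric exceeds the test metric by more than
    `threshold` percentage points, else 0."""
    return int(metric_train - metric_test > threshold)

# With the logged precisions the gap is ~80 points, so the flag trips:
print(overfit_flag(99.50, 19.33))  # → 1
```

A gap this large is typical when training on SMOTE-balanced data and testing on the original ~0.17% fraud distribution: recall stays high, but test precision collapses.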

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6198503	total: 92.9ms	remaining: 26.4s
[... per-iteration CatBoost training log condensed: iterations 1–198 omitted ...]
199:	learn: 0.0126972	total: 27s	remaining: 11.5s
200:	learn: 0.0126285	total: 27.1s	remaining: 11.3s
201:	learn: 0.0125451	total: 27.1s	remaining: 11.2s
202:	learn: 0.0124532	total: 27.3s	remaining: 11s
203:	learn: 0.0123684	total: 27.4s	remaining: 10.9s
204:	learn: 0.0123198	total: 27.4s	remaining: 10.7s
205:	learn: 0.0122587	total: 27.5s	remaining: 10.6s
206:	learn: 0.0121810	total: 27.6s	remaining: 10.4s
207:	learn: 0.0120593	total: 27.7s	remaining: 10.3s
208:	learn: 0.0119414	total: 27.8s	remaining: 10.1s
209:	learn: 0.0118630	total: 27.9s	remaining: 9.97s
210:	learn: 0.0117946	total: 28s	remaining: 9.81s
211:	learn: 0.0117451	total: 28.1s	remaining: 9.68s
212:	learn: 0.0116774	total: 28.2s	remaining: 9.53s
213:	learn: 0.0115443	total: 28.3s	remaining: 9.38s
214:	learn: 0.0114496	total: 28.4s	remaining: 9.24s
215:	learn: 0.0113997	total: 28.5s	remaining: 9.1s
216:	learn: 0.0113132	total: 28.6s	remaining: 8.95s
217:	learn: 0.0112321	total: 28.7s	remaining: 8.81s
218:	learn: 0.0111805	total: 28.8s	remaining: 8.67s
219:	learn: 0.0111051	total: 28.8s	remaining: 8.52s
220:	learn: 0.0110185	total: 28.9s	remaining: 8.38s
221:	learn: 0.0109457	total: 29s	remaining: 8.24s
222:	learn: 0.0108846	total: 29.1s	remaining: 8.1s
223:	learn: 0.0108449	total: 29.2s	remaining: 7.96s
224:	learn: 0.0107781	total: 29.3s	remaining: 7.82s
225:	learn: 0.0107439	total: 29.4s	remaining: 7.68s
226:	learn: 0.0106829	total: 29.5s	remaining: 7.54s
227:	learn: 0.0106467	total: 29.6s	remaining: 7.4s
228:	learn: 0.0105672	total: 29.7s	remaining: 7.26s
229:	learn: 0.0104809	total: 29.8s	remaining: 7.13s
230:	learn: 0.0104155	total: 29.9s	remaining: 6.99s
231:	learn: 0.0103314	total: 30s	remaining: 6.85s
232:	learn: 0.0102881	total: 30.1s	remaining: 6.72s
233:	learn: 0.0102052	total: 30.2s	remaining: 6.58s
234:	learn: 0.0101509	total: 30.3s	remaining: 6.44s
235:	learn: 0.0100932	total: 30.4s	remaining: 6.31s
236:	learn: 0.0100187	total: 30.5s	remaining: 6.17s
237:	learn: 0.0099858	total: 30.6s	remaining: 6.03s
238:	learn: 0.0099476	total: 30.7s	remaining: 5.9s
239:	learn: 0.0099039	total: 30.7s	remaining: 5.76s
240:	learn: 0.0098225	total: 30.8s	remaining: 5.63s
241:	learn: 0.0097551	total: 31s	remaining: 5.5s
242:	learn: 0.0096966	total: 31s	remaining: 5.37s
243:	learn: 0.0096375	total: 31.1s	remaining: 5.23s
244:	learn: 0.0095501	total: 31.3s	remaining: 5.1s
245:	learn: 0.0094691	total: 31.3s	remaining: 4.97s
246:	learn: 0.0094425	total: 31.4s	remaining: 4.83s
247:	learn: 0.0093880	total: 31.5s	remaining: 4.7s
248:	learn: 0.0093274	total: 31.6s	remaining: 4.57s
249:	learn: 0.0092974	total: 31.7s	remaining: 4.44s
250:	learn: 0.0092507	total: 31.8s	remaining: 4.31s
251:	learn: 0.0091781	total: 31.9s	remaining: 4.17s
252:	learn: 0.0090864	total: 32s	remaining: 4.04s
253:	learn: 0.0090160	total: 32.1s	remaining: 3.92s
254:	learn: 0.0089911	total: 32.2s	remaining: 3.79s
255:	learn: 0.0089636	total: 32.3s	remaining: 3.65s
256:	learn: 0.0089480	total: 32.4s	remaining: 3.52s
257:	learn: 0.0089118	total: 32.4s	remaining: 3.4s
258:	learn: 0.0088857	total: 32.5s	remaining: 3.26s
259:	learn: 0.0088437	total: 32.6s	remaining: 3.14s
260:	learn: 0.0088093	total: 32.7s	remaining: 3.01s
261:	learn: 0.0087141	total: 32.8s	remaining: 2.88s
262:	learn: 0.0086763	total: 32.9s	remaining: 2.75s
263:	learn: 0.0086131	total: 33s	remaining: 2.62s
264:	learn: 0.0085466	total: 33.1s	remaining: 2.5s
265:	learn: 0.0085087	total: 33.2s	remaining: 2.37s
266:	learn: 0.0084773	total: 33.3s	remaining: 2.24s
267:	learn: 0.0084170	total: 33.4s	remaining: 2.12s
268:	learn: 0.0083523	total: 33.5s	remaining: 1.99s
269:	learn: 0.0083035	total: 33.6s	remaining: 1.86s
270:	learn: 0.0082311	total: 33.6s	remaining: 1.74s
271:	learn: 0.0081741	total: 33.7s	remaining: 1.61s
272:	learn: 0.0081311	total: 33.8s	remaining: 1.49s
273:	learn: 0.0080813	total: 33.9s	remaining: 1.36s
274:	learn: 0.0080518	total: 34s	remaining: 1.24s
275:	learn: 0.0080151	total: 34.1s	remaining: 1.11s
276:	learn: 0.0079847	total: 34.2s	remaining: 988ms
277:	learn: 0.0079417	total: 34.3s	remaining: 864ms
278:	learn: 0.0079012	total: 34.4s	remaining: 740ms
279:	learn: 0.0078568	total: 34.5s	remaining: 616ms
280:	learn: 0.0078160	total: 34.6s	remaining: 493ms
281:	learn: 0.0077688	total: 34.7s	remaining: 369ms
282:	learn: 0.0077236	total: 34.8s	remaining: 246ms
283:	learn: 0.0076924	total: 34.9s	remaining: 123ms
284:	learn: 0.0076587	total: 35s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.30
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.65
 - F1-Score_Train: 99.65
 - Precision_Test: 16.87
 - Recall_Test: 88.89
 - AUPRC_Test: 73.01
 - Accuracy_Test: 99.24
 - F1-Score_Test: 28.35
 - max_depth: 6
 - n_estimators: 285
 - learning_rate: 0.03
 - scale_pos_weight: 5.75
 - (resto de hiperparámetros de CatBoost sin fijar: None, es decir, valores por defecto)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
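
Las proporciones de clase impresas antes y después de SMOTE siguen el patrón de `value_counts(normalize=True)` de pandas. El siguiente sketch reproduce ese chequeo con datos sintéticos hipotéticos y un sobremuestreo aleatorio simple como sustituto de SMOTE (que en el notebook proviene de imbalanced-learn y, a diferencia de este sustituto, interpola ejemplos sintéticos nuevos):

```python
import numpy as np
import pandas as pd

# Datos sintéticos hipotéticos: clase 0 mayoritaria, clase 1 (fraude) minoritaria.
rng = np.random.default_rng(42)
y = pd.Series(np.r_[np.zeros(5934, dtype=int), np.ones(10, dtype=int)], name="Class")

print("📊 Antes:", y.value_counts(normalize=True), sep="\n")

# Sobremuestreo aleatorio de la minoritaria hasta igualar a la mayoritaria
# (sustituto simplificado de SMOTE, solo para ilustrar el chequeo de proporciones).
idx_min = y[y == 1].index.to_numpy()
extra = rng.choice(idx_min, size=(y == 0).sum() - len(idx_min), replace=True)
y_bal = pd.concat([y, y.loc[extra]], ignore_index=True)

print("📈 Después:", y_bal.value_counts(normalize=True), sep="\n")  # 0.5 / 0.5
```

Tras el balanceo, ambas clases quedan al 50%, como muestran los logs.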

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Log de entrenamiento CatBoost, 285 iteraciones: la pérdida de entrenamiento (learn) desciende de 0.6202890 a 0.0065002; tiempo total ≈ 33.5s.]
[I 2024-12-19 14:43:55,220] Trial 30 finished with value: 75.80834607680522 and parameters: {'learning_rate': 0.030836379878527897, 'max_depth': 6, 'n_estimators': 285, 'scale_pos_weight': 5.7536982118548305}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.33
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.66
 - F1-Score_Train: 99.66
 - Precision_Test: 16.98
 - Recall_Test: 87.30
 - AUPRC_Test: 76.71
 - Accuracy_Test: 99.26
 - F1-Score_Test: 28.42
 - max_depth: 6
 - n_estimators: 285
 - learning_rate: 0.03
 - scale_pos_weight: 5.75
 - (resto de hiperparámetros de CatBoost sin fijar: None, es decir, valores por defecto)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 75.8083
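
La métrica AUPRC que promedian estos logs corresponde al área bajo la curva precisión–recall, adecuada para clases muy desbalanceadas como el fraude; en scikit-learn se obtiene con `average_precision_score` (suposición sobre cómo la calcula el notebook). Un ejemplo mínimo con datos hipotéticos:

```python
from sklearn.metrics import average_precision_score

# Etiquetas reales y puntuaciones del modelo (datos hipotéticos).
y_true = [0, 0, 1, 1]
y_scores = [0.1, 0.4, 0.35, 0.8]

# AUPRC (average precision): resume precisión vs. recall en todos los umbrales.
auprc = average_precision_score(y_true, y_scores)
print(f"AUPRC: {auprc * 100:.4f}")  # → AUPRC: 83.3333
```

A diferencia del accuracy, esta métrica penaliza directamente los falsos positivos y falsos negativos sobre la clase minoritaria, por lo que se usa como criterio de selección de modelos en este proyecto.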

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Log de entrenamiento CatBoost, iteraciones 0–108: la pérdida de entrenamiento (learn) desciende de 0.6214611 a 0.0184028.]
109:	learn: 0.0181969	total: 12s	remaining: 18.8s
110:	learn: 0.0180529	total: 12.2s	remaining: 18.8s
111:	learn: 0.0179055	total: 12.4s	remaining: 18.8s
112:	learn: 0.0177897	total: 12.6s	remaining: 18.8s
113:	learn: 0.0176502	total: 12.7s	remaining: 18.7s
114:	learn: 0.0174451	total: 12.9s	remaining: 18.7s
115:	learn: 0.0172856	total: 13.1s	remaining: 18.7s
116:	learn: 0.0171557	total: 13.2s	remaining: 18.6s
117:	learn: 0.0169708	total: 13.4s	remaining: 18.6s
118:	learn: 0.0168068	total: 13.6s	remaining: 18.6s
119:	learn: 0.0166318	total: 13.7s	remaining: 18.5s
120:	learn: 0.0164025	total: 13.9s	remaining: 18.5s
121:	learn: 0.0161495	total: 14.1s	remaining: 18.5s
122:	learn: 0.0159530	total: 14.3s	remaining: 18.5s
123:	learn: 0.0158281	total: 14.5s	remaining: 18.4s
124:	learn: 0.0156219	total: 14.6s	remaining: 18.4s
125:	learn: 0.0154537	total: 14.8s	remaining: 18.3s
126:	learn: 0.0152727	total: 14.9s	remaining: 18.1s
127:	learn: 0.0151519	total: 15s	remaining: 18s
128:	learn: 0.0149510	total: 15.1s	remaining: 17.9s
129:	learn: 0.0148371	total: 15.2s	remaining: 17.7s
130:	learn: 0.0147167	total: 15.2s	remaining: 17.6s
131:	learn: 0.0145791	total: 15.3s	remaining: 17.4s
132:	learn: 0.0144443	total: 15.5s	remaining: 17.3s
133:	learn: 0.0143302	total: 15.5s	remaining: 17.2s
134:	learn: 0.0141905	total: 15.7s	remaining: 17s
135:	learn: 0.0140667	total: 15.8s	remaining: 16.9s
136:	learn: 0.0139800	total: 15.8s	remaining: 16.8s
137:	learn: 0.0138782	total: 15.9s	remaining: 16.6s
138:	learn: 0.0137822	total: 16.1s	remaining: 16.5s
139:	learn: 0.0136541	total: 16.1s	remaining: 16.4s
140:	learn: 0.0135470	total: 16.2s	remaining: 16.2s
141:	learn: 0.0134088	total: 16.3s	remaining: 16.1s
142:	learn: 0.0133474	total: 16.4s	remaining: 15.9s
143:	learn: 0.0132459	total: 16.5s	remaining: 15.8s
144:	learn: 0.0131757	total: 16.6s	remaining: 15.7s
145:	learn: 0.0130045	total: 16.7s	remaining: 15.6s
146:	learn: 0.0129063	total: 16.8s	remaining: 15.4s
147:	learn: 0.0127847	total: 16.9s	remaining: 15.3s
148:	learn: 0.0126471	total: 17s	remaining: 15.2s
149:	learn: 0.0125516	total: 17.1s	remaining: 15s
150:	learn: 0.0124373	total: 17.2s	remaining: 14.9s
151:	learn: 0.0123087	total: 17.3s	remaining: 14.8s
152:	learn: 0.0122431	total: 17.3s	remaining: 14.6s
153:	learn: 0.0121167	total: 17.4s	remaining: 14.5s
154:	learn: 0.0119933	total: 17.6s	remaining: 14.4s
155:	learn: 0.0119211	total: 17.6s	remaining: 14.2s
156:	learn: 0.0118273	total: 17.7s	remaining: 14.1s
157:	learn: 0.0117232	total: 17.8s	remaining: 14s
158:	learn: 0.0115916	total: 17.9s	remaining: 13.9s
159:	learn: 0.0115161	total: 18s	remaining: 13.8s
160:	learn: 0.0114376	total: 18.1s	remaining: 13.6s
161:	learn: 0.0113545	total: 18.2s	remaining: 13.5s
162:	learn: 0.0112873	total: 18.3s	remaining: 13.4s
163:	learn: 0.0112324	total: 18.4s	remaining: 13.2s
164:	learn: 0.0111335	total: 18.5s	remaining: 13.1s
165:	learn: 0.0110111	total: 18.6s	remaining: 13s
166:	learn: 0.0109303	total: 18.7s	remaining: 12.9s
167:	learn: 0.0108149	total: 18.8s	remaining: 12.8s
168:	learn: 0.0107274	total: 18.9s	remaining: 12.6s
169:	learn: 0.0106398	total: 19s	remaining: 12.5s
170:	learn: 0.0105894	total: 19.1s	remaining: 12.4s
171:	learn: 0.0105361	total: 19.2s	remaining: 12.3s
172:	learn: 0.0104025	total: 19.3s	remaining: 12.1s
173:	learn: 0.0103344	total: 19.4s	remaining: 12s
174:	learn: 0.0102085	total: 19.5s	remaining: 11.9s
175:	learn: 0.0101545	total: 19.6s	remaining: 11.8s
176:	learn: 0.0100895	total: 19.7s	remaining: 11.7s
177:	learn: 0.0100316	total: 19.8s	remaining: 11.6s
178:	learn: 0.0099384	total: 19.9s	remaining: 11.4s
179:	learn: 0.0098771	total: 19.9s	remaining: 11.3s
180:	learn: 0.0098052	total: 20s	remaining: 11.2s
181:	learn: 0.0096789	total: 20.1s	remaining: 11.1s
182:	learn: 0.0096176	total: 20.2s	remaining: 10.9s
183:	learn: 0.0095363	total: 20.3s	remaining: 10.8s
184:	learn: 0.0094897	total: 20.4s	remaining: 10.7s
185:	learn: 0.0094127	total: 20.5s	remaining: 10.6s
186:	learn: 0.0093783	total: 20.6s	remaining: 10.5s
187:	learn: 0.0093219	total: 20.7s	remaining: 10.4s
188:	learn: 0.0092875	total: 20.8s	remaining: 10.2s
189:	learn: 0.0092298	total: 20.9s	remaining: 10.1s
190:	learn: 0.0091217	total: 21s	remaining: 10s
191:	learn: 0.0090790	total: 21.1s	remaining: 9.88s
192:	learn: 0.0089987	total: 21.2s	remaining: 9.77s
193:	learn: 0.0089303	total: 21.3s	remaining: 9.65s
194:	learn: 0.0088887	total: 21.4s	remaining: 9.53s
195:	learn: 0.0088000	total: 21.5s	remaining: 9.42s
196:	learn: 0.0087746	total: 21.6s	remaining: 9.3s
197:	learn: 0.0086985	total: 21.7s	remaining: 9.2s
198:	learn: 0.0086712	total: 21.8s	remaining: 9.09s
199:	learn: 0.0086215	total: 21.9s	remaining: 8.98s
200:	learn: 0.0085729	total: 22s	remaining: 8.87s
201:	learn: 0.0085213	total: 22.1s	remaining: 8.75s
202:	learn: 0.0084545	total: 22.2s	remaining: 8.64s
203:	learn: 0.0084188	total: 22.3s	remaining: 8.53s
204:	learn: 0.0083554	total: 22.4s	remaining: 8.41s
205:	learn: 0.0083124	total: 22.5s	remaining: 8.3s
206:	learn: 0.0082681	total: 22.6s	remaining: 8.18s
207:	learn: 0.0082073	total: 22.7s	remaining: 8.07s
208:	learn: 0.0081661	total: 22.8s	remaining: 7.96s
209:	learn: 0.0081250	total: 22.9s	remaining: 7.84s
210:	learn: 0.0080517	total: 23s	remaining: 7.73s
211:	learn: 0.0079556	total: 23.1s	remaining: 7.62s
212:	learn: 0.0078973	total: 23.2s	remaining: 7.51s
213:	learn: 0.0078179	total: 23.3s	remaining: 7.39s
214:	learn: 0.0077852	total: 23.4s	remaining: 7.28s
215:	learn: 0.0077321	total: 23.5s	remaining: 7.17s
216:	learn: 0.0076798	total: 23.5s	remaining: 7.05s
217:	learn: 0.0076204	total: 23.6s	remaining: 6.94s
218:	learn: 0.0075621	total: 23.7s	remaining: 6.83s
219:	learn: 0.0075176	total: 23.8s	remaining: 6.72s
220:	learn: 0.0074905	total: 23.9s	remaining: 6.61s
221:	learn: 0.0074447	total: 24s	remaining: 6.5s
222:	learn: 0.0073991	total: 24.1s	remaining: 6.38s
223:	learn: 0.0073664	total: 24.2s	remaining: 6.27s
224:	learn: 0.0073405	total: 24.3s	remaining: 6.16s
225:	learn: 0.0072722	total: 24.4s	remaining: 6.04s
226:	learn: 0.0072261	total: 24.5s	remaining: 5.93s
227:	learn: 0.0071831	total: 24.6s	remaining: 5.82s
228:	learn: 0.0071430	total: 24.7s	remaining: 5.71s
229:	learn: 0.0070444	total: 24.8s	remaining: 5.61s
230:	learn: 0.0070094	total: 25s	remaining: 5.51s
231:	learn: 0.0069591	total: 25.2s	remaining: 5.42s
232:	learn: 0.0068933	total: 25.3s	remaining: 5.33s
233:	learn: 0.0068451	total: 25.5s	remaining: 5.24s
234:	learn: 0.0068072	total: 25.7s	remaining: 5.14s
235:	learn: 0.0067406	total: 25.9s	remaining: 5.05s
236:	learn: 0.0067019	total: 26.1s	remaining: 4.96s
237:	learn: 0.0066545	total: 26.3s	remaining: 4.86s
238:	learn: 0.0066212	total: 26.4s	remaining: 4.75s
239:	learn: 0.0065727	total: 26.6s	remaining: 4.66s
240:	learn: 0.0065351	total: 26.8s	remaining: 4.56s
241:	learn: 0.0064871	total: 27s	remaining: 4.46s
242:	learn: 0.0064520	total: 27.1s	remaining: 4.35s
243:	learn: 0.0064153	total: 27.3s	remaining: 4.25s
244:	learn: 0.0063804	total: 27.5s	remaining: 4.15s
245:	learn: 0.0063479	total: 27.6s	remaining: 4.05s
246:	learn: 0.0063116	total: 27.8s	remaining: 3.94s
247:	learn: 0.0062649	total: 28.1s	remaining: 3.85s
248:	learn: 0.0062302	total: 28.2s	remaining: 3.74s
249:	learn: 0.0062031	total: 28.4s	remaining: 3.63s
250:	learn: 0.0061711	total: 28.5s	remaining: 3.52s
251:	learn: 0.0061496	total: 28.7s	remaining: 3.41s
252:	learn: 0.0061200	total: 28.8s	remaining: 3.3s
253:	learn: 0.0060859	total: 29s	remaining: 3.2s
254:	learn: 0.0060492	total: 29.2s	remaining: 3.09s
255:	learn: 0.0060158	total: 29.4s	remaining: 2.98s
256:	learn: 0.0059881	total: 29.6s	remaining: 2.88s
257:	learn: 0.0059578	total: 29.7s	remaining: 2.77s
258:	learn: 0.0059177	total: 29.9s	remaining: 2.65s
259:	learn: 0.0058985	total: 30.1s	remaining: 2.54s
260:	learn: 0.0058686	total: 30.2s	remaining: 2.43s
261:	learn: 0.0058497	total: 30.4s	remaining: 2.32s
262:	learn: 0.0058274	total: 30.5s	remaining: 2.2s
263:	learn: 0.0058128	total: 30.6s	remaining: 2.08s
264:	learn: 0.0058045	total: 30.7s	remaining: 1.97s
265:	learn: 0.0057410	total: 30.8s	remaining: 1.85s
266:	learn: 0.0056868	total: 30.9s	remaining: 1.73s
267:	learn: 0.0056476	total: 31s	remaining: 1.62s
268:	learn: 0.0056293	total: 31.1s	remaining: 1.5s
269:	learn: 0.0056098	total: 31.2s	remaining: 1.38s
270:	learn: 0.0055617	total: 31.3s	remaining: 1.27s
271:	learn: 0.0055231	total: 31.4s	remaining: 1.15s
272:	learn: 0.0054811	total: 31.4s	remaining: 1.04s
273:	learn: 0.0054669	total: 31.5s	remaining: 921ms
274:	learn: 0.0054474	total: 31.6s	remaining: 805ms
275:	learn: 0.0054360	total: 31.7s	remaining: 689ms
276:	learn: 0.0054184	total: 31.8s	remaining: 574ms
277:	learn: 0.0054019	total: 31.9s	remaining: 459ms
278:	learn: 0.0053777	total: 32s	remaining: 344ms
279:	learn: 0.0053328	total: 32.1s	remaining: 229ms
280:	learn: 0.0053193	total: 32.2s	remaining: 115ms
281:	learn: 0.0053061	total: 32.3s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.48
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.74
 - F1-Score_Train: 99.74
 - Precision_Test: 18.66
 - Recall_Test: 86.51
 - AUPRC_Test: 75.52
 - Accuracy_Test: 99.34
 - F1-Score_Test: 30.70
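El indicador "Sobreajuste: 1" de arriba refleja la gran caída del AUPRC entre entrenamiento (99.98) y test (75.52). Un boceto mínimo de cómo podría calcularse esa métrica y el indicador, asumiendo un umbral de 10 puntos porcentuales (el umbral real del cuaderno no se muestra en esta salida):

```python
# Boceto ilustrativo: AUPRC en porcentaje y marca de sobreajuste
# comparando train vs test. El umbral de 10 puntos es un supuesto.
from sklearn.metrics import average_precision_score

def auprc_pct(y_true, y_scores):
    """AUPRC (average precision) expresado en porcentaje, como en los resultados."""
    return average_precision_score(y_true, y_scores) * 100

def flag_sobreajuste(auprc_train, auprc_test, margen=10.0):
    """Devuelve 1 si la caída de train a test supera el margen, 0 en caso contrario."""
    return int(auprc_train - auprc_test > margen)

# Ejemplo con los valores reportados para el Fold 1
print(flag_sobreajuste(99.98, 75.52))  # -> 1
```

Con un modelo que rankea perfectamente la clase positiva, `auprc_pct` devuelve 100.0; la brecha de más de 24 puntos del Fold 1 dispara la marca.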
 - max_depth: 6
 - n_estimators: 282
 - learning_rate: 0.03
 - scale_pos_weight: 5.74
 - (resto de parámetros de get_params(): None, es decir, valores por defecto de CatBoost)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6220895	total: 97.6ms	remaining: 27.4s
[registro de entrenamiento truncado: iteraciones 1–280 omitidas por brevedad]
281:	learn: 0.0081828	total: 32.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.24
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.62
 - F1-Score_Train: 99.62
 - Precision_Test: 16.05
 - Recall_Test: 88.89
 - AUPRC_Test: 75.15
 - Accuracy_Test: 99.20
 - F1-Score_Test: 27.18
 - max_depth: 6
 - n_estimators: 282
 - learning_rate: 0.03
 - scale_pos_weight: 5.74
 - (resto de parámetros de get_params(): None, es decir, valores por defecto de CatBoost)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6225581	total: 88.2ms	remaining: 24.8s
1:	learn: 0.5486515	total: 177ms	remaining: 24.7s
...
280:	learn: 0.0068321	total: 32.3s	remaining: 115ms
281:	learn: 0.0067925	total: 32.4s	remaining: 0us
[training loss fell steadily from 0.62 to 0.0068 over 282 iterations; intermediate lines omitted]
[I 2024-12-19 14:45:39,283] Trial 31 finished with value: 75.63142654627858 and parameters: {'learning_rate': 0.029871745940680643, 'max_depth': 6, 'n_estimators': 282, 'scale_pos_weight': 5.742025260880375}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.29
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.64
 - F1-Score_Train: 99.65
 - Precision_Test: 16.11
 - Recall_Test: 88.10
 - AUPRC_Test: 76.22
 - Accuracy_Test: 99.21
 - F1-Score_Test: 27.24
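AUPRC is the headline metric in these results because, with roughly 0.17% fraud, accuracy (99.21 here) is inflated by the majority class. In scikit-learn, AUPRC corresponds to `average_precision_score` (multiplied by 100 to match the percentages reported above); a minimal check with toy scores:

```python
from sklearn.metrics import average_precision_score

# Five transactions, one fraud; the model ranks the fraud highest.
y_true = [0, 0, 0, 0, 1]
y_score = [0.10, 0.20, 0.30, 0.40, 0.95]

# A perfect ranking of the single positive yields AUPRC = 1.0, while a
# random scorer would hover near the positive rate (0.2 in this toy case).
auprc = average_precision_score(y_true, y_score)
print(f"AUPRC: {auprc:.2f}")  # → AUPRC: 1.00
```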
 - max_depth: 6
 - n_estimators: 282
 - learning_rate: 0.03
 - scale_pos_weight: 5.74
 - (remaining CatBoost parameters: None, i.e. library defaults; omitted for brevity)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 75.6314
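The cross-validation average above is the mean of the per-fold test AUPRC values. The fold loop can be sketched with scikit-learn alone (logistic regression and synthetic data stand in for the tuned CatBoost model and the real dataset):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# Illustrative imbalanced dataset (~5% positives).
X, y = make_classification(n_samples=1000, weights=[0.95], random_state=0)

fold_auprcs = []
# StratifiedKFold keeps the fraud rate stable across folds, which matters
# when positives are this rare.
for train_idx, test_idx in StratifiedKFold(
    n_splits=3, shuffle=True, random_state=0
).split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores = model.predict_proba(X[test_idx])[:, 1]
    fold_auprcs.append(100 * average_precision_score(y[test_idx], scores))

print(f"🏆 Mean CV AUPRC: {np.mean(fold_auprcs):.4f}")
```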

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6257779	total: 181ms	remaining: 47.7s
1:	learn: 0.5505313	total: 342ms	remaining: 45s
...
263:	learn: 0.0066540	total: 30.5s	remaining: 115ms
264:	learn: 0.0066167	total: 30.6s	remaining: 0us
[training loss fell steadily from 0.63 to 0.0066 over 265 iterations; intermediate lines omitted]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.11
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.55
 - F1-Score_Train: 99.55
 - Precision_Test: 12.60
 - Recall_Test: 87.30
 - AUPRC_Test: 73.13
 - Accuracy_Test: 98.96
 - F1-Score_Test: 22.02
 - Optimized hyperparameters (every other CatBoost parameter was None, i.e. left at its library default):
 - max_depth: 6
 - n_estimators: 265
 - learning_rate: 0.03
 - scale_pos_weight: 7.21
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
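The before/after proportions printed above come from pandas' `value_counts(normalize=True)` on the target column. A minimal sketch with synthetic counts of the same order of magnitude as the dataset:

```python
# How the class proportions printed above are computed: normalized value
# counts on the target column. The counts here are synthetic (100,000 rows,
# 168 positives), chosen to mirror the ~0.168% fraud rate in the notebook.
import pandas as pd

y_antes = pd.Series([0] * 99832 + [1] * 168, name="Class")
props = y_antes.value_counts(normalize=True)
print(props)  # 0 -> 0.99832, 1 -> 0.00168
```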

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6309154	total: 163ms	remaining: 42.9s
[... iterations 1-263 omitted; loss decreased from 0.6309 to 0.0099 over ~30s ...]
264:	learn: 0.0098180	total: 30.5s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.84
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.41
 - F1-Score_Train: 99.41
 - Precision_Test: 11.48
 - Recall_Test: 89.68
 - AUPRC_Test: 71.45
 - Accuracy_Test: 98.82
 - F1-Score_Test: 20.36
 - Optimized hyperparameters (every other CatBoost parameter was None, i.e. left at its library default):
 - max_depth: 6
 - n_estimators: 265
 - learning_rate: 0.03
 - scale_pos_weight: 7.21
✅ Tamaño del DataFrame actualizado: (2, 133)
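The `Sobreajuste: 1` flag in the fold summaries signals that training metrics (AUPRC ~99.97) far exceed test metrics (AUPRC ~71-73). The notebook's exact criterion is not shown in this output; a plausible rule, assuming a fixed threshold on the train-test AUPRC gap, would be:

```python
# Illustrative overfitting flag: 1 when the train-test AUPRC gap exceeds a
# threshold. ASSUMPTION: the 10-point threshold and the gap-based rule are
# hypothetical; the notebook's actual criterion may differ.
def hay_sobreajuste(auprc_train: float, auprc_test: float,
                    umbral: float = 10.0) -> int:
    """Return 1 if the train-test AUPRC gap exceeds `umbral` points."""
    return int(auprc_train - auprc_test > umbral)

# Values from the Fold 2 summary above: a gap of ~28.5 points
flag = hay_sobreajuste(99.97, 71.45)
print(flag)  # 1
```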

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.6288953	total: 86.6ms	remaining: 22.9s
[... iterations 1-246 omitted; loss decreased from 0.6289 to 0.0094 ...]
247:	learn: 0.0093781	total: 27.8s	remaining: 1.9s
248:	learn: 0.0093533	total: 27.9s	remaining: 1.79s
249:	learn: 0.0092794	total: 28.1s	remaining: 1.69s
250:	learn: 0.0092200	total: 28.3s	remaining: 1.58s
251:	learn: 0.0091817	total: 28.5s	remaining: 1.47s
252:	learn: 0.0091128	total: 28.7s	remaining: 1.36s
253:	learn: 0.0090556	total: 28.8s	remaining: 1.25s
254:	learn: 0.0090033	total: 29s	remaining: 1.14s
255:	learn: 0.0089498	total: 29.2s	remaining: 1.03s
256:	learn: 0.0088995	total: 29.4s	remaining: 914ms
257:	learn: 0.0088781	total: 29.5s	remaining: 801ms
258:	learn: 0.0088116	total: 29.7s	remaining: 689ms
259:	learn: 0.0087645	total: 29.9s	remaining: 575ms
260:	learn: 0.0087184	total: 30.1s	remaining: 461ms
261:	learn: 0.0086956	total: 30.3s	remaining: 346ms
262:	learn: 0.0086673	total: 30.4s	remaining: 231ms
263:	learn: 0.0086019	total: 30.6s	remaining: 116ms
264:	learn: 0.0085591	total: 30.8s	remaining: 0us
[I 2024-12-19 14:47:20,616] Trial 32 finished with value: 72.7404439438197 and parameters: {'learning_rate': 0.025573344355910715, 'max_depth': 6, 'n_estimators': 265, 'scale_pos_weight': 7.214464450742685}. Best is trial 17 with value: 76.82995176096074.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 98.83
 - Recall_Train: 100.00
 - AUPRC_Train: 99.97
 - Accuracy_Train: 99.41
 - F1-Score_Train: 99.41
 - Precision_Test: 11.29
 - Recall_Test: 88.10
 - AUPRC_Test: 73.64
 - Accuracy_Test: 98.82
 - F1-Score_Test: 20.02
 - max_depth: 6
 - n_estimators: 265
 - learning_rate: 0.03
 - scale_pos_weight: 7.21
 - (resto de hiperparámetros de CatBoost: None, es decir, valores por defecto)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 72.7404
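La métrica que guía la optimización en estos registros es el AUPRC (área bajo la curva precisión-exhaustividad), expresado en porcentaje. Como referencia, un boceto mínimo de cómo se calcula con `average_precision_score` de scikit-learn; las etiquetas y probabilidades del ejemplo son hipotéticas, no provienen del notebook:

```python
from sklearn.metrics import average_precision_score

# Etiquetas reales y probabilidades predichas de ejemplo (valores hipotéticos)
y_true = [0, 0, 0, 0, 1, 1]
y_score = [0.10, 0.20, 0.15, 0.90, 0.80, 0.95]

# AUPRC en porcentaje, como se reporta en los resultados de cada trial
auprc = 100 * average_precision_score(y_true, y_score)
print(f"AUPRC: {auprc:.2f}")
```

En un dataset tan desbalanceado como este, el AUPRC es más informativo que el accuracy, porque penaliza directamente la baja precisión sobre la clase minoritaria (fraude).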

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
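Las proporciones "Antes de SMOTE" / "Después de SMOTE" que se imprimen arriba corresponden a un conteo normalizado de la variable objetivo. Un boceto mínimo que reproduce esa salida con pandas; la serie `y` es un ejemplo construido a mano que imita el desbalance del dataset (~0.17 % de fraudes), no los datos reales:

```python
import pandas as pd

# Serie hipotética que imita el desbalance extremo de la variable "Class"
y = pd.Series([0] * 99832 + [1] * 168, name="Class")

# Proporciones de clase, equivalentes a la salida "📊 Antes de SMOTE"
proporciones = y.value_counts(normalize=True)
print(proporciones)
```

Tras aplicar SMOTE sobre el fold de entrenamiento, el mismo conteo devolvería 0.50 / 0.50, como muestra el registro.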

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4619808	total: 90.8ms	remaining: 26.3s
[... iteraciones 1-289 omitidas: la pérdida desciende hasta estancarse en 0.0028617 a partir de la iteración ~215 ...]
290:	learn: 0.0028617	total: 32.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.76
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.88
 - F1-Score_Train: 99.88
 - Precision_Test: 29.32
 - Recall_Test: 84.92
 - AUPRC_Test: 81.08
 - Accuracy_Test: 99.63
 - F1-Score_Test: 43.58
 - max_depth: 6
 - n_estimators: 291
 - learning_rate: 0.10
 - scale_pos_weight: 5.28
 - (resto de hiperparámetros de CatBoost: None, es decir, valores por defecto)
✅ Tamaño del DataFrame actualizado: (1, 133)
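El campo "Sobreajuste: 1" de los resultados refleja la gran brecha entre las métricas de train (AUPRC ≈ 99.99) y de test (AUPRC ≈ 81.08). Una forma sencilla de derivar esa bandera es comparar la brecha train-test contra un umbral; este es un boceto ilustrativo con nombre de función y umbral hipotéticos, no la implementación del notebook:

```python
def marcar_sobreajuste(auprc_train: float, auprc_test: float, umbral: float = 10.0) -> int:
    """Devuelve 1 si la brecha train-test supera el umbral (en puntos porcentuales).

    Una brecha grande indica que el modelo memoriza el conjunto de
    entrenamiento (sobre todo tras SMOTE) pero generaliza peor en test.
    """
    return int(auprc_train - auprc_test > umbral)

# Con las métricas de este fold, la bandera se activa
print(marcar_sobreajuste(99.99, 81.08))
```

Este tipo de bandera ayuda a descartar trials cuyo AUPRC de entrenamiento es casi perfecto pero cuyo rendimiento real en test queda muy por debajo.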

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4820127	total: 97.9ms	remaining: 28.4s
1:	learn: 0.3238883	total: 196ms	remaining: 28.3s
2:	learn: 0.2345617	total: 281ms	remaining: 27s
[... iteraciones intermedias omitidas: la pérdida de entrenamiento desciende de 0.23 a 0.0029 ...]
170:	learn: 0.0035739	total: 18.3s	remaining: 12.8s
171:	learn: 0.0035739	total: 18.4s	remaining: 12.7s
172:	learn: 0.0035739	total: 18.5s	remaining: 12.6s
173:	learn: 0.0035739	total: 18.5s	remaining: 12.5s
174:	learn: 0.0035654	total: 18.6s	remaining: 12.3s
175:	learn: 0.0035178	total: 18.7s	remaining: 12.2s
176:	learn: 0.0035083	total: 18.8s	remaining: 12.1s
177:	learn: 0.0035083	total: 18.9s	remaining: 12s
178:	learn: 0.0035082	total: 18.9s	remaining: 11.9s
179:	learn: 0.0035082	total: 19s	remaining: 11.7s
180:	learn: 0.0035082	total: 19.1s	remaining: 11.6s
181:	learn: 0.0035082	total: 19.2s	remaining: 11.5s
182:	learn: 0.0034910	total: 19.3s	remaining: 11.4s
183:	learn: 0.0034910	total: 19.4s	remaining: 11.3s
184:	learn: 0.0034909	total: 19.4s	remaining: 11.1s
185:	learn: 0.0034909	total: 19.5s	remaining: 11s
186:	learn: 0.0034909	total: 19.6s	remaining: 10.9s
187:	learn: 0.0034908	total: 19.7s	remaining: 10.8s
188:	learn: 0.0034908	total: 19.8s	remaining: 10.7s
189:	learn: 0.0034908	total: 19.8s	remaining: 10.5s
190:	learn: 0.0034908	total: 19.9s	remaining: 10.4s
191:	learn: 0.0034907	total: 20s	remaining: 10.3s
192:	learn: 0.0034907	total: 20.1s	remaining: 10.2s
193:	learn: 0.0034906	total: 20.1s	remaining: 10.1s
194:	learn: 0.0034906	total: 20.2s	remaining: 9.95s
195:	learn: 0.0034906	total: 20.3s	remaining: 9.83s
196:	learn: 0.0034906	total: 20.4s	remaining: 9.71s
197:	learn: 0.0034906	total: 20.4s	remaining: 9.6s
198:	learn: 0.0034906	total: 20.5s	remaining: 9.48s
199:	learn: 0.0034905	total: 20.6s	remaining: 9.38s
200:	learn: 0.0034905	total: 20.7s	remaining: 9.26s
201:	learn: 0.0034905	total: 20.7s	remaining: 9.14s
202:	learn: 0.0034905	total: 20.8s	remaining: 9.02s
203:	learn: 0.0034905	total: 20.9s	remaining: 8.91s
204:	learn: 0.0034905	total: 21s	remaining: 8.8s
205:	learn: 0.0034904	total: 21s	remaining: 8.68s
206:	learn: 0.0034904	total: 21.2s	remaining: 8.59s
207:	learn: 0.0034904	total: 21.3s	remaining: 8.49s
208:	learn: 0.0034904	total: 21.4s	remaining: 8.4s
209:	learn: 0.0034904	total: 21.5s	remaining: 8.31s
210:	learn: 0.0034904	total: 21.7s	remaining: 8.21s
211:	learn: 0.0034904	total: 21.8s	remaining: 8.13s
212:	learn: 0.0034903	total: 22s	remaining: 8.04s
213:	learn: 0.0034903	total: 22.1s	remaining: 7.95s
214:	learn: 0.0034904	total: 22.2s	remaining: 7.86s
215:	learn: 0.0034904	total: 22.4s	remaining: 7.78s
216:	learn: 0.0034903	total: 22.5s	remaining: 7.69s
217:	learn: 0.0034903	total: 22.7s	remaining: 7.6s
218:	learn: 0.0034903	total: 22.8s	remaining: 7.5s
219:	learn: 0.0034903	total: 23s	remaining: 7.41s
220:	learn: 0.0034902	total: 23.1s	remaining: 7.31s
221:	learn: 0.0034901	total: 23.2s	remaining: 7.22s
222:	learn: 0.0034901	total: 23.4s	remaining: 7.13s
223:	learn: 0.0034901	total: 23.5s	remaining: 7.04s
224:	learn: 0.0034901	total: 23.7s	remaining: 6.94s
225:	learn: 0.0034900	total: 23.8s	remaining: 6.84s
226:	learn: 0.0034900	total: 23.9s	remaining: 6.75s
227:	learn: 0.0034900	total: 24.1s	remaining: 6.66s
228:	learn: 0.0034899	total: 24.2s	remaining: 6.56s
229:	learn: 0.0034899	total: 24.4s	remaining: 6.46s
230:	learn: 0.0034899	total: 24.5s	remaining: 6.36s
231:	learn: 0.0034899	total: 24.6s	remaining: 6.26s
232:	learn: 0.0034899	total: 24.8s	remaining: 6.17s
233:	learn: 0.0034899	total: 24.9s	remaining: 6.07s
234:	learn: 0.0034663	total: 25.1s	remaining: 5.97s
235:	learn: 0.0034663	total: 25.2s	remaining: 5.87s
236:	learn: 0.0034663	total: 25.3s	remaining: 5.77s
237:	learn: 0.0034663	total: 25.5s	remaining: 5.67s
238:	learn: 0.0034662	total: 25.6s	remaining: 5.57s
239:	learn: 0.0034662	total: 25.7s	remaining: 5.47s
240:	learn: 0.0034662	total: 25.9s	remaining: 5.37s
241:	learn: 0.0034661	total: 26s	remaining: 5.27s
242:	learn: 0.0034661	total: 26.2s	remaining: 5.17s
243:	learn: 0.0034661	total: 26.3s	remaining: 5.06s
244:	learn: 0.0034661	total: 26.4s	remaining: 4.96s
245:	learn: 0.0034165	total: 26.6s	remaining: 4.87s
246:	learn: 0.0033777	total: 26.8s	remaining: 4.77s
247:	learn: 0.0033777	total: 26.9s	remaining: 4.66s
248:	learn: 0.0033604	total: 27s	remaining: 4.55s
249:	learn: 0.0033226	total: 27.1s	remaining: 4.44s
250:	learn: 0.0032961	total: 27.2s	remaining: 4.33s
251:	learn: 0.0032763	total: 27.3s	remaining: 4.22s
252:	learn: 0.0032287	total: 27.3s	remaining: 4.11s
253:	learn: 0.0032154	total: 27.4s	remaining: 4s
254:	learn: 0.0032154	total: 27.5s	remaining: 3.88s
255:	learn: 0.0031940	total: 27.6s	remaining: 3.77s
256:	learn: 0.0031653	total: 27.7s	remaining: 3.66s
257:	learn: 0.0031428	total: 27.8s	remaining: 3.55s
258:	learn: 0.0031428	total: 27.8s	remaining: 3.44s
259:	learn: 0.0031428	total: 27.9s	remaining: 3.33s
260:	learn: 0.0031428	total: 28s	remaining: 3.22s
261:	learn: 0.0031428	total: 28.1s	remaining: 3.11s
262:	learn: 0.0031427	total: 28.2s	remaining: 3s
263:	learn: 0.0031427	total: 28.3s	remaining: 2.89s
264:	learn: 0.0031261	total: 28.3s	remaining: 2.78s
265:	learn: 0.0030913	total: 28.4s	remaining: 2.67s
266:	learn: 0.0030652	total: 28.5s	remaining: 2.56s
267:	learn: 0.0030200	total: 28.6s	remaining: 2.45s
268:	learn: 0.0030200	total: 28.7s	remaining: 2.35s
269:	learn: 0.0029768	total: 28.8s	remaining: 2.24s
270:	learn: 0.0029768	total: 28.8s	remaining: 2.13s
271:	learn: 0.0029768	total: 28.9s	remaining: 2.02s
272:	learn: 0.0029767	total: 29s	remaining: 1.92s
273:	learn: 0.0029767	total: 29.1s	remaining: 1.81s
274:	learn: 0.0029766	total: 29.2s	remaining: 1.7s
275:	learn: 0.0029513	total: 29.3s	remaining: 1.59s
276:	learn: 0.0029513	total: 29.4s	remaining: 1.48s
277:	learn: 0.0029513	total: 29.4s	remaining: 1.38s
278:	learn: 0.0029514	total: 29.5s	remaining: 1.27s
279:	learn: 0.0029513	total: 29.6s	remaining: 1.16s
280:	learn: 0.0029513	total: 29.6s	remaining: 1.05s
281:	learn: 0.0029513	total: 29.8s	remaining: 950ms
282:	learn: 0.0029428	total: 29.8s	remaining: 843ms
283:	learn: 0.0029428	total: 29.9s	remaining: 737ms
284:	learn: 0.0029428	total: 30s	remaining: 631ms
285:	learn: 0.0029428	total: 30.1s	remaining: 525ms
286:	learn: 0.0029428	total: 30.1s	remaining: 420ms
287:	learn: 0.0029428	total: 30.2s	remaining: 315ms
288:	learn: 0.0029428	total: 30.3s	remaining: 210ms
289:	learn: 0.0029428	total: 30.4s	remaining: 105ms
290:	learn: 0.0029428	total: 30.4s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.78
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.89
 - F1-Score_Train: 99.89
 - Precision_Test: 35.37
 - Recall_Test: 87.30
 - AUPRC_Test: 79.86
 - Accuracy_Test: 99.71
 - F1-Score_Test: 50.34
 - max_depth: 6
 - n_estimators: 291
 - learning_rate: 0.10
 - scale_pos_weight: 5.28
 - (remaining CatBoost parameters omitted: all None, i.e. library defaults)
✅ Updated DataFrame size: (2, 133)

🔄 Fold 3: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
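The proportions above show the training fold going from roughly 99.8 / 0.2 to an exact 50 / 50 split. The project uses `imblearn`'s SMOTE for this; as a minimal, dependency-free sketch of the interpolation idea behind SMOTE (hypothetical helper name, not the `imblearn` implementation):

```python
import numpy as np

def smote_like_oversample(X_min, n_new, k=5, rng=None):
    """Generate synthetic minority samples by interpolating between
    each chosen point and one of its k nearest minority neighbours."""
    rng = np.random.default_rng(rng)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # distances from the chosen point to every other minority point
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neigh = np.argsort(d)[1:k + 1]        # skip the point itself
        j = rng.choice(neigh)
        lam = rng.random()                    # interpolation factor in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)

# e.g. 10 minority rows padded with 90 synthetic ones -> 50/50 vs. 100 majority rows
X_min = np.random.default_rng(0).normal(size=(10, 4))
synth = smote_like_oversample(X_min, n_new=90, rng=0)
print(synth.shape)  # (90, 4)
```

Note that, as in the log, oversampling is applied only inside each training fold, never to the test split, to avoid leaking synthetic points into evaluation.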

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4353343	total: 85.5ms	remaining: 24.8s
	… (iterations 1–289 omitted: training loss plateaus around 0.0033)
290:	learn: 0.0032996	total: 30s	remaining: 0us
[I 2024-12-19 14:49:00,319] Trial 33 finished with value: 80.40946207404374 and parameters: {'learning_rate': 0.09923534902000127, 'max_depth': 6, 'n_estimators': 291, 'scale_pos_weight': 5.27629579592143}. Best is trial 33 with value: 80.40946207404374.
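The trial line above shows the searched space (`learning_rate`, `max_depth`, `n_estimators`, `scale_pos_weight`) with test AUPRC as the objective. The notebook uses Optuna's sampler; as a portable sketch of the same idea, here is a plain random search over a comparable space, with scikit-learn's `GradientBoostingClassifier` standing in for CatBoost (it lacks `scale_pos_weight`, so that parameter is omitted here — this is an illustrative stand-in, not the project's exact setup):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import average_precision_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X, y = make_classification(n_samples=600, weights=[0.9], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

best = (-1.0, None)
for _ in range(5):  # a few random trials; Optuna's TPE sampler explores smarter
    params = {
        "learning_rate": float(rng.uniform(0.01, 0.3)),
        "max_depth": int(rng.integers(3, 8)),
        "n_estimators": int(rng.integers(100, 300)),
    }
    clf = GradientBoostingClassifier(random_state=0, **params).fit(X_tr, y_tr)
    auprc = average_precision_score(y_te, clf.predict_proba(X_te)[:, 1]) * 100
    if auprc > best[0]:
        best = (auprc, params)

print(f"best AUPRC: {best[0]:.2f} with {best[1]}")
```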
✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.75
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.87
 - F1-Score_Train: 99.87
 - Precision_Test: 31.05
 - Recall_Test: 86.51
 - AUPRC_Test: 80.29
 - Accuracy_Test: 99.65
 - F1-Score_Test: 45.70
 - max_depth: 6
 - n_estimators: 291
 - learning_rate: 0.10
 - scale_pos_weight: 5.28
 - (remaining CatBoost parameters omitted: all None, i.e. library defaults)
✅ Updated DataFrame size: (3, 133)

🏆 Cross-validation average AUPRC: 80.4095
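AUPRC (area under the precision-recall curve, computed here with scikit-learn's `average_precision_score`) is the headline metric because, with ~0.17% fraud, accuracy is nearly useless: the ~99.9/80 train/test AUPRC gap in the summaries above is what the "Overfitting: 1" flag records. A minimal sketch on hypothetical scores:

```python
import numpy as np
from sklearn.metrics import average_precision_score, recall_score

# toy imbalanced ground truth and classifier scores (illustrative values only)
rng = np.random.default_rng(1)
y_true = np.array([0] * 95 + [1] * 5)
y_score = np.concatenate([rng.uniform(0.0, 0.4, 95),   # legitimate transactions
                          rng.uniform(0.3, 1.0, 5)])   # frauds, mostly higher

auprc = average_precision_score(y_true, y_score) * 100
recall = recall_score(y_true, (y_score >= 0.5).astype(int)) * 100
print(f"AUPRC: {auprc:.2f}%  Recall@0.5: {recall:.2f}%")
```

Unlike ROC-AUC, AUPRC degrades sharply when false positives pile up relative to the few true frauds, which is why it separates these models better than accuracy does.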

🔍 Optimizing hyperparameters for CatBoost with Optuna...

🔄 Fold 1: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4877405	total: 76.5ms	remaining: 22.6s
1:	learn: 0.3389861	total: 153ms	remaining: 22.5s
2:	learn: 0.2422950	total: 233ms	remaining: 22.7s
... [iterations 3 to 293 omitted: the training loss decreases steadily and plateaus near 0.0037 from roughly iteration 180 onward] ...
294:	learn: 0.0036049	total: 28.4s	remaining: 96.4ms
295:	learn: 0.0035900	total: 28.5s	remaining: 0us

✅ Results for CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.69
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.85
 - Precision_Test: 25.60
 - Recall_Test: 84.92
 - AUPRC_Test: 79.03
 - Accuracy_Test: 99.56
 - F1-Score_Test: 39.34
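The `Sobreajuste: 1` flag accompanies a large train/test gap (AUPRC 99.99 on train vs 79.03 on test). A hypothetical reconstruction of such a flag, assuming it fires when train AUPRC exceeds test AUPRC by more than a fixed number of points:

```python
def overfit_flag(auprc_train: float, auprc_test: float, threshold: float = 10.0) -> int:
    """Hypothetical 'Sobreajuste' flag: 1 when the train AUPRC exceeds the
    test AUPRC by more than `threshold` percentage points, else 0."""
    return int(auprc_train - auprc_test > threshold)

print(overfit_flag(99.99, 79.03))  # fold 1 values from the log above
```

A gap this size is typical when SMOTE makes the training set trivially separable; the test metrics on the original imbalanced distribution are the ones that matter.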
 - max_depth: 5
 - n_estimators: 296
 - learning_rate: 0.10
 - scale_pos_weight: 5.12
 - (all other CatBoost parameters: None)
✅ DataFrame size updated: (1, 133)
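The "DataFrame size updated" message suggests each evaluation is appended as one row to a running results table (133 columns: metrics plus every CatBoost parameter). A minimal sketch of that accumulation step, with hypothetical keys and only a few of the columns:

```python
import pandas as pd

results_df = pd.DataFrame()  # accumulated across models, techniques, and folds

fold_result = {
    "Modelo": "CatBoost",
    "Tecnica": "Optuna con SMOTE",
    "AUPRC_Test": 79.03,
    "Recall_Test": 84.92,
}
# Append this fold's row; in the notebook the row also carries every model parameter
results_df = pd.concat([results_df, pd.DataFrame([fold_result])], ignore_index=True)
print(results_df.shape)
```

Using `pd.concat` with `ignore_index=True` is the idiomatic replacement for the removed `DataFrame.append` method.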

🔄 Fold 2: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna con SMOTE)...
0:	learn: 0.4948363	total: 80.8ms	remaining: 23.8s
1:	learn: 0.3627427	total: 160ms	remaining: 23.6s
2:	learn: 0.2711008	total: 241ms	remaining: 23.5s
... [iterations 3 to 293 omitted: the training loss decreases steadily and plateaus near 0.0053 from roughly iteration 170 onward] ...
294:	learn: 0.0052592	total: 28.3s	remaining: 95.9ms
295:	learn: 0.0052593	total: 28.3s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.56
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.78
 - F1-Score_Train: 99.78
 - Precision_Test: 22.24
 - Recall_Test: 88.10
 - AUPRC_Test: 74.99
 - Accuracy_Test: 99.46
 - F1-Score_Test: 35.52
 - max_depth: 5
 - n_estimators: 296
 - learning_rate: 0.10
 - scale_pos_weight: 5.12
 - (el resto de parámetros de CatBoost queda en None: valores por defecto)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[… log de entrenamiento condensado: 296 iteraciones; pérdida learn de 0.4832185 a 0.0043654; tiempo total: 28.2s …]
[I 2024-12-19 14:50:32,300] Trial 34 finished with value: 77.09665448747673 and parameters: {'learning_rate': 0.09821891331885638, 'max_depth': 5, 'n_estimators': 296, 'scale_pos_weight': 5.121024209926356}. Best is trial 33 with value: 80.40946207404374.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.62
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.81
 - F1-Score_Train: 99.81
 - Precision_Test: 25.29
 - Recall_Test: 86.51
 - AUPRC_Test: 77.26
 - Accuracy_Test: 99.55
 - F1-Score_Test: 39.14
 - max_depth: 5
 - n_estimators: 296
 - learning_rate: 0.10
 - scale_pos_weight: 5.12
 - (el resto de parámetros de CatBoost queda en None: valores por defecto)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 77.0967

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[… log de entrenamiento condensado: iteraciones 0–209 mostradas; pérdida learn de 0.4633701 a 0.0029438 …]
210:	learn: 0.0029438	total: 21.4s	remaining: 8.73s
211:	learn: 0.0029438	total: 21.5s	remaining: 8.63s
212:	learn: 0.0029438	total: 21.6s	remaining: 8.51s
213:	learn: 0.0029438	total: 21.7s	remaining: 8.41s
214:	learn: 0.0029438	total: 21.7s	remaining: 8.29s
215:	learn: 0.0029438	total: 21.8s	remaining: 8.19s
216:	learn: 0.0029438	total: 21.9s	remaining: 8.08s
217:	learn: 0.0029438	total: 22s	remaining: 7.97s
218:	learn: 0.0029438	total: 22.1s	remaining: 7.86s
219:	learn: 0.0029438	total: 22.1s	remaining: 7.75s
220:	learn: 0.0029438	total: 22.3s	remaining: 7.65s
221:	learn: 0.0029438	total: 22.4s	remaining: 7.56s
222:	learn: 0.0029438	total: 22.5s	remaining: 7.48s
223:	learn: 0.0029438	total: 22.7s	remaining: 7.39s
224:	learn: 0.0029438	total: 22.8s	remaining: 7.3s
225:	learn: 0.0029438	total: 22.9s	remaining: 7.21s
226:	learn: 0.0029438	total: 23.1s	remaining: 7.12s
227:	learn: 0.0029438	total: 23.2s	remaining: 7.03s
228:	learn: 0.0029438	total: 23.4s	remaining: 6.93s
229:	learn: 0.0029438	total: 23.5s	remaining: 6.85s
230:	learn: 0.0029438	total: 23.6s	remaining: 6.76s
231:	learn: 0.0029438	total: 23.8s	remaining: 6.67s
232:	learn: 0.0029438	total: 23.9s	remaining: 6.58s
233:	learn: 0.0029438	total: 24.1s	remaining: 6.48s
234:	learn: 0.0029438	total: 24.2s	remaining: 6.39s
235:	learn: 0.0029438	total: 24.3s	remaining: 6.29s
236:	learn: 0.0029438	total: 24.5s	remaining: 6.2s
237:	learn: 0.0029438	total: 24.6s	remaining: 6.11s
238:	learn: 0.0029438	total: 24.8s	remaining: 6.01s
239:	learn: 0.0029438	total: 24.9s	remaining: 5.92s
240:	learn: 0.0029438	total: 25.1s	remaining: 5.83s
241:	learn: 0.0029438	total: 25.2s	remaining: 5.73s
242:	learn: 0.0029438	total: 25.4s	remaining: 5.63s
243:	learn: 0.0029438	total: 25.5s	remaining: 5.54s
244:	learn: 0.0029438	total: 25.7s	remaining: 5.45s
245:	learn: 0.0029438	total: 25.8s	remaining: 5.35s
246:	learn: 0.0029438	total: 25.9s	remaining: 5.25s
247:	learn: 0.0029438	total: 26.1s	remaining: 5.15s
248:	learn: 0.0029438	total: 26.2s	remaining: 5.05s
249:	learn: 0.0029438	total: 26.4s	remaining: 4.96s
250:	learn: 0.0029438	total: 26.5s	remaining: 4.85s
251:	learn: 0.0029438	total: 26.6s	remaining: 4.76s
252:	learn: 0.0029438	total: 26.8s	remaining: 4.66s
253:	learn: 0.0029438	total: 26.9s	remaining: 4.56s
254:	learn: 0.0029438	total: 27.1s	remaining: 4.46s
255:	learn: 0.0029438	total: 27.2s	remaining: 4.36s
256:	learn: 0.0029438	total: 27.3s	remaining: 4.25s
257:	learn: 0.0029438	total: 27.5s	remaining: 4.15s
258:	learn: 0.0029438	total: 27.6s	remaining: 4.05s
259:	learn: 0.0029438	total: 27.8s	remaining: 3.95s
260:	learn: 0.0029438	total: 27.8s	remaining: 3.84s
261:	learn: 0.0029438	total: 27.9s	remaining: 3.73s
262:	learn: 0.0029438	total: 28s	remaining: 3.62s
263:	learn: 0.0029438	total: 28.1s	remaining: 3.51s
264:	learn: 0.0029438	total: 28.2s	remaining: 3.4s
265:	learn: 0.0029438	total: 28.2s	remaining: 3.29s
266:	learn: 0.0029438	total: 28.3s	remaining: 3.18s
267:	learn: 0.0029438	total: 28.4s	remaining: 3.07s
268:	learn: 0.0029438	total: 28.5s	remaining: 2.96s
269:	learn: 0.0029438	total: 28.5s	remaining: 2.85s
270:	learn: 0.0029438	total: 28.6s	remaining: 2.75s
271:	learn: 0.0029438	total: 28.7s	remaining: 2.64s
272:	learn: 0.0029438	total: 28.8s	remaining: 2.53s
273:	learn: 0.0029438	total: 28.9s	remaining: 2.42s
274:	learn: 0.0029438	total: 28.9s	remaining: 2.31s
275:	learn: 0.0029438	total: 29s	remaining: 2.21s
276:	learn: 0.0029438	total: 29.1s	remaining: 2.1s
277:	learn: 0.0029438	total: 29.2s	remaining: 2s
278:	learn: 0.0029438	total: 29.3s	remaining: 1.89s
279:	learn: 0.0029438	total: 29.4s	remaining: 1.78s
280:	learn: 0.0029438	total: 29.4s	remaining: 1.68s
281:	learn: 0.0029438	total: 29.5s	remaining: 1.57s
282:	learn: 0.0029438	total: 29.6s	remaining: 1.46s
283:	learn: 0.0029438	total: 29.7s	remaining: 1.36s
284:	learn: 0.0029438	total: 29.7s	remaining: 1.25s
285:	learn: 0.0029438	total: 29.8s	remaining: 1.15s
286:	learn: 0.0029438	total: 29.9s	remaining: 1.04s
287:	learn: 0.0029438	total: 30s	remaining: 937ms
288:	learn: 0.0029438	total: 30s	remaining: 832ms
289:	learn: 0.0029438	total: 30.1s	remaining: 728ms
290:	learn: 0.0029438	total: 30.2s	remaining: 623ms
291:	learn: 0.0029438	total: 30.3s	remaining: 519ms
292:	learn: 0.0029438	total: 30.4s	remaining: 415ms
293:	learn: 0.0029438	total: 30.5s	remaining: 311ms
294:	learn: 0.0029438	total: 30.5s	remaining: 207ms
295:	learn: 0.0029438	total: 30.6s	remaining: 103ms
296:	learn: 0.0029438	total: 30.7s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.74
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.87
 - F1-Score_Train: 99.87
 - Precision_Test: 28.73
 - Recall_Test: 84.13
 - AUPRC_Test: 80.75
 - Accuracy_Test: 99.62
 - F1-Score_Test: 42.83
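As a sanity check on the reported test metrics, the F1-score should equal the harmonic mean of precision and recall. A minimal sketch verifying this for the fold-1 numbers above (values hardcoded from the log output):

```python
# Verify that the reported F1-score is the harmonic mean of precision
# and recall. Values are taken from the fold-1 test metrics above.
precision_test = 28.73
recall_test = 84.13

f1 = 2 * precision_test * recall_test / (precision_test + recall_test)
print(round(f1, 2))  # matches the reported F1-Score_Test of 42.83
```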
 - Explicitly set hyperparameters (all remaining CatBoost parameters were None, i.e., library defaults):
 - max_depth: 6
 - n_estimators: 297
 - learning_rate: 0.10
 - scale_pos_weight: 5.16
✅ Tamaño del DataFrame actualizado: (1, 133)
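The parameter listing above comes from dumping every entry of the classifier's parameter dictionary, so most values are None (CatBoost defaults). A small helper like the following keeps only the explicitly set values; this is a sketch where `params` stands in for what would come from something like `model.get_params()` in the notebook:

```python
# Keep only hyperparameters that were explicitly set (non-None values).
# In the notebook, `params` would come from e.g. model.get_params();
# here a small hardcoded dict stands in for illustration.
params = {
    "max_depth": 6,
    "n_estimators": 297,
    "learning_rate": 0.10,
    "scale_pos_weight": 5.16,
    "iterations": None,
    "depth": None,
}

set_params = {k: v for k, v in params.items() if v is not None}
print(set_params)
```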

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
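The proportions above show the effect of the resampling step: before SMOTE the positive class is ~0.17% of the training fold; afterwards both classes are exactly 50%. SMOTE (from `imblearn`) synthesizes new minority samples by interpolating between neighbors; as a dependency-free sketch of the balancing idea only, plain random oversampling of the minority class yields the same 50/50 proportions:

```python
import pandas as pd

# Toy imbalanced target: a handful of positives, mimicking the fraud dataset.
y = pd.Series([0] * 2988 + [1] * 5, name="Class")
print(y.value_counts(normalize=True))  # heavily skewed toward class 0

# Random oversampling: resample the minority class with replacement until
# both classes have the same size. (SMOTE additionally interpolates between
# neighbors instead of duplicating rows; this sketch shows only the balancing.)
minority = y[y == 1]
oversampled = minority.sample(n=(y == 0).sum(), replace=True, random_state=42)
y_balanced = pd.concat([y[y == 0], oversampled], ignore_index=True)
print(y_balanced.value_counts(normalize=True))  # 0.5 / 0.5
```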

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Abbreviated CatBoost training log: iterations 0–296; the training loss falls from 0.4834 to ~0.00307, plateauing from iteration ~179 onward; total time 31s.]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.78
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.89
 - F1-Score_Train: 99.89
 - Precision_Test: 34.69
 - Recall_Test: 88.10
 - AUPRC_Test: 78.36
 - Accuracy_Test: 99.70
 - F1-Score_Test: 49.78
 - Explicitly set hyperparameters (all remaining CatBoost parameters were None, i.e., library defaults):
 - max_depth: 6
 - n_estimators: 297
 - learning_rate: 0.10
 - scale_pos_weight: 5.16
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Abbreviated CatBoost training log: iterations 0–100; the training loss falls from 0.4369 to 0.0058.]
101:	learn: 0.0056609	total: 12.5s	remaining: 23.9s
102:	learn: 0.0055449	total: 12.6s	remaining: 23.8s
103:	learn: 0.0054973	total: 12.7s	remaining: 23.6s
104:	learn: 0.0054106	total: 12.8s	remaining: 23.4s
105:	learn: 0.0052854	total: 12.9s	remaining: 23.3s
106:	learn: 0.0051059	total: 13s	remaining: 23.1s
107:	learn: 0.0050079	total: 13.1s	remaining: 22.9s
108:	learn: 0.0048918	total: 13.2s	remaining: 22.8s
109:	learn: 0.0048318	total: 13.3s	remaining: 22.6s
110:	learn: 0.0047938	total: 13.4s	remaining: 22.4s
111:	learn: 0.0047312	total: 13.5s	remaining: 22.3s
112:	learn: 0.0046954	total: 13.6s	remaining: 22.1s
113:	learn: 0.0046221	total: 13.7s	remaining: 22s
114:	learn: 0.0045732	total: 13.8s	remaining: 21.8s
115:	learn: 0.0045113	total: 13.9s	remaining: 21.6s
116:	learn: 0.0044752	total: 14s	remaining: 21.5s
117:	learn: 0.0044123	total: 14.1s	remaining: 21.3s
118:	learn: 0.0043639	total: 14.1s	remaining: 21.2s
119:	learn: 0.0042751	total: 14.2s	remaining: 21s
120:	learn: 0.0042322	total: 14.3s	remaining: 20.9s
121:	learn: 0.0041778	total: 14.4s	remaining: 20.7s
122:	learn: 0.0041352	total: 14.5s	remaining: 20.6s
123:	learn: 0.0040674	total: 14.6s	remaining: 20.4s
124:	learn: 0.0040506	total: 14.7s	remaining: 20.3s
125:	learn: 0.0040506	total: 14.8s	remaining: 20.1s
126:	learn: 0.0040505	total: 14.9s	remaining: 19.9s
127:	learn: 0.0039885	total: 15s	remaining: 19.8s
128:	learn: 0.0039885	total: 15.1s	remaining: 19.6s
129:	learn: 0.0039885	total: 15.1s	remaining: 19.5s
130:	learn: 0.0039885	total: 15.2s	remaining: 19.3s
131:	learn: 0.0039885	total: 15.3s	remaining: 19.1s
132:	learn: 0.0039408	total: 15.4s	remaining: 19s
133:	learn: 0.0039325	total: 15.5s	remaining: 18.8s
134:	learn: 0.0039325	total: 15.5s	remaining: 18.6s
135:	learn: 0.0038999	total: 15.6s	remaining: 18.5s
136:	learn: 0.0038775	total: 15.7s	remaining: 18.4s
137:	learn: 0.0038103	total: 15.8s	remaining: 18.2s
138:	learn: 0.0037123	total: 15.9s	remaining: 18.1s
139:	learn: 0.0037123	total: 16s	remaining: 17.9s
140:	learn: 0.0037123	total: 16.1s	remaining: 17.8s
141:	learn: 0.0036391	total: 16.2s	remaining: 17.7s
142:	learn: 0.0036391	total: 16.3s	remaining: 17.5s
143:	learn: 0.0036195	total: 16.4s	remaining: 17.4s
144:	learn: 0.0035566	total: 16.5s	remaining: 17.3s
145:	learn: 0.0035432	total: 16.6s	remaining: 17.2s
146:	learn: 0.0035069	total: 16.7s	remaining: 17s
147:	learn: 0.0034629	total: 16.8s	remaining: 16.9s
148:	learn: 0.0034479	total: 16.9s	remaining: 16.7s
149:	learn: 0.0034479	total: 16.9s	remaining: 16.6s
150:	learn: 0.0034479	total: 17s	remaining: 16.4s
151:	learn: 0.0034406	total: 17.1s	remaining: 16.3s
152:	learn: 0.0034406	total: 17.2s	remaining: 16.1s
153:	learn: 0.0033908	total: 17.3s	remaining: 16s
154:	learn: 0.0033469	total: 17.4s	remaining: 15.9s
155:	learn: 0.0033072	total: 17.4s	remaining: 15.8s
156:	learn: 0.0032685	total: 17.6s	remaining: 15.7s
157:	learn: 0.0032685	total: 17.6s	remaining: 15.5s
158:	learn: 0.0032685	total: 17.7s	remaining: 15.4s
159:	learn: 0.0032514	total: 17.8s	remaining: 15.2s
160:	learn: 0.0032414	total: 17.9s	remaining: 15.1s
161:	learn: 0.0032413	total: 17.9s	remaining: 15s
162:	learn: 0.0032414	total: 18s	remaining: 14.8s
163:	learn: 0.0032414	total: 18.1s	remaining: 14.7s
164:	learn: 0.0032414	total: 18.2s	remaining: 14.5s
165:	learn: 0.0032413	total: 18.2s	remaining: 14.4s
166:	learn: 0.0032413	total: 18.3s	remaining: 14.3s
167:	learn: 0.0032413	total: 18.4s	remaining: 14.1s
168:	learn: 0.0032413	total: 18.5s	remaining: 14s
169:	learn: 0.0032413	total: 18.6s	remaining: 13.9s
170:	learn: 0.0032413	total: 18.7s	remaining: 13.8s
171:	learn: 0.0032413	total: 18.8s	remaining: 13.7s
172:	learn: 0.0032413	total: 18.9s	remaining: 13.6s
173:	learn: 0.0032413	total: 19s	remaining: 13.5s
174:	learn: 0.0032413	total: 19.2s	remaining: 13.4s
175:	learn: 0.0032413	total: 19.3s	remaining: 13.2s
176:	learn: 0.0032413	total: 19.4s	remaining: 13.2s
177:	learn: 0.0032413	total: 19.6s	remaining: 13.1s
178:	learn: 0.0032413	total: 19.7s	remaining: 13s
179:	learn: 0.0032413	total: 19.8s	remaining: 12.9s
180:	learn: 0.0032268	total: 20s	remaining: 12.8s
181:	learn: 0.0031802	total: 20.2s	remaining: 12.7s
182:	learn: 0.0031801	total: 20.3s	remaining: 12.6s
183:	learn: 0.0031801	total: 20.4s	remaining: 12.6s
184:	learn: 0.0031801	total: 20.6s	remaining: 12.4s
185:	learn: 0.0031801	total: 20.7s	remaining: 12.3s
186:	learn: 0.0031801	total: 20.8s	remaining: 12.2s
187:	learn: 0.0031801	total: 20.9s	remaining: 12.1s
188:	learn: 0.0031801	total: 21s	remaining: 12s
189:	learn: 0.0031801	total: 21.2s	remaining: 11.9s
190:	learn: 0.0031801	total: 21.3s	remaining: 11.8s
191:	learn: 0.0031801	total: 21.4s	remaining: 11.7s
192:	learn: 0.0031801	total: 21.6s	remaining: 11.6s
193:	learn: 0.0031801	total: 21.7s	remaining: 11.5s
194:	learn: 0.0031801	total: 21.8s	remaining: 11.4s
195:	learn: 0.0031801	total: 22s	remaining: 11.3s
196:	learn: 0.0031801	total: 22.1s	remaining: 11.2s
197:	learn: 0.0031801	total: 22.2s	remaining: 11.1s
198:	learn: 0.0031801	total: 22.4s	remaining: 11s
199:	learn: 0.0031801	total: 22.5s	remaining: 10.9s
200:	learn: 0.0031801	total: 22.6s	remaining: 10.8s
201:	learn: 0.0031801	total: 22.8s	remaining: 10.7s
202:	learn: 0.0031801	total: 22.9s	remaining: 10.6s
203:	learn: 0.0031801	total: 23s	remaining: 10.5s
204:	learn: 0.0031800	total: 23.2s	remaining: 10.4s
205:	learn: 0.0031800	total: 23.3s	remaining: 10.3s
206:	learn: 0.0031800	total: 23.4s	remaining: 10.2s
207:	learn: 0.0031800	total: 23.5s	remaining: 10.1s
208:	learn: 0.0031800	total: 23.7s	remaining: 9.97s
209:	learn: 0.0031800	total: 23.8s	remaining: 9.86s
210:	learn: 0.0031800	total: 23.9s	remaining: 9.76s
211:	learn: 0.0031800	total: 24.1s	remaining: 9.65s
212:	learn: 0.0031800	total: 24.2s	remaining: 9.54s
213:	learn: 0.0031800	total: 24.3s	remaining: 9.44s
214:	learn: 0.0031800	total: 24.4s	remaining: 9.32s
215:	learn: 0.0031800	total: 24.6s	remaining: 9.21s
216:	learn: 0.0031800	total: 24.7s	remaining: 9.1s
217:	learn: 0.0031800	total: 24.7s	remaining: 8.97s
218:	learn: 0.0031800	total: 24.8s	remaining: 8.84s
219:	learn: 0.0031800	total: 24.9s	remaining: 8.72s
220:	learn: 0.0031800	total: 25s	remaining: 8.59s
221:	learn: 0.0031800	total: 25.1s	remaining: 8.46s
222:	learn: 0.0031800	total: 25.2s	remaining: 8.35s
223:	learn: 0.0031800	total: 25.2s	remaining: 8.22s
224:	learn: 0.0031800	total: 25.3s	remaining: 8.1s
225:	learn: 0.0031800	total: 25.4s	remaining: 7.97s
226:	learn: 0.0031800	total: 25.4s	remaining: 7.84s
227:	learn: 0.0031800	total: 25.5s	remaining: 7.71s
228:	learn: 0.0031800	total: 25.6s	remaining: 7.59s
229:	learn: 0.0031800	total: 25.6s	remaining: 7.47s
230:	learn: 0.0031800	total: 25.7s	remaining: 7.34s
231:	learn: 0.0031800	total: 25.8s	remaining: 7.22s
232:	learn: 0.0031800	total: 25.9s	remaining: 7.1s
233:	learn: 0.0031800	total: 25.9s	remaining: 6.98s
234:	learn: 0.0031800	total: 26s	remaining: 6.86s
235:	learn: 0.0031800	total: 26.1s	remaining: 6.74s
236:	learn: 0.0031800	total: 26.1s	remaining: 6.62s
237:	learn: 0.0031799	total: 26.2s	remaining: 6.5s
238:	learn: 0.0031799	total: 26.3s	remaining: 6.38s
239:	learn: 0.0031799	total: 26.4s	remaining: 6.26s
240:	learn: 0.0031799	total: 26.4s	remaining: 6.14s
241:	learn: 0.0031799	total: 26.5s	remaining: 6.02s
242:	learn: 0.0031800	total: 26.6s	remaining: 5.9s
243:	learn: 0.0031799	total: 26.6s	remaining: 5.78s
244:	learn: 0.0031799	total: 26.7s	remaining: 5.67s
245:	learn: 0.0031799	total: 26.8s	remaining: 5.55s
246:	learn: 0.0031799	total: 26.8s	remaining: 5.43s
247:	learn: 0.0031799	total: 26.9s	remaining: 5.32s
248:	learn: 0.0031799	total: 27s	remaining: 5.21s
249:	learn: 0.0031799	total: 27.1s	remaining: 5.09s
250:	learn: 0.0031798	total: 27.2s	remaining: 4.98s
251:	learn: 0.0031798	total: 27.2s	remaining: 4.86s
252:	learn: 0.0031799	total: 27.3s	remaining: 4.75s
253:	learn: 0.0031799	total: 27.4s	remaining: 4.63s
254:	learn: 0.0031798	total: 27.4s	remaining: 4.52s
255:	learn: 0.0031798	total: 27.5s	remaining: 4.41s
256:	learn: 0.0031798	total: 27.6s	remaining: 4.29s
257:	learn: 0.0031798	total: 27.7s	remaining: 4.18s
258:	learn: 0.0031798	total: 27.7s	remaining: 4.07s
259:	learn: 0.0031798	total: 27.8s	remaining: 3.96s
260:	learn: 0.0031798	total: 27.9s	remaining: 3.84s
261:	learn: 0.0031798	total: 27.9s	remaining: 3.73s
262:	learn: 0.0031798	total: 28s	remaining: 3.62s
263:	learn: 0.0031798	total: 28.1s	remaining: 3.51s
264:	learn: 0.0031798	total: 28.2s	remaining: 3.4s
265:	learn: 0.0031798	total: 28.2s	remaining: 3.29s
266:	learn: 0.0031798	total: 28.3s	remaining: 3.18s
267:	learn: 0.0031798	total: 28.4s	remaining: 3.07s
268:	learn: 0.0031798	total: 28.5s	remaining: 2.96s
269:	learn: 0.0031798	total: 28.5s	remaining: 2.85s
270:	learn: 0.0031798	total: 28.6s	remaining: 2.75s
271:	learn: 0.0031798	total: 28.7s	remaining: 2.64s
272:	learn: 0.0031798	total: 28.8s	remaining: 2.53s
273:	learn: 0.0031798	total: 28.8s	remaining: 2.42s
274:	learn: 0.0031797	total: 28.9s	remaining: 2.31s
275:	learn: 0.0031798	total: 29s	remaining: 2.2s
276:	learn: 0.0031798	total: 29.1s	remaining: 2.1s
277:	learn: 0.0031798	total: 29.1s	remaining: 1.99s
278:	learn: 0.0031798	total: 29.2s	remaining: 1.89s
279:	learn: 0.0031798	total: 29.3s	remaining: 1.78s
280:	learn: 0.0031797	total: 29.4s	remaining: 1.67s
281:	learn: 0.0031797	total: 29.4s	remaining: 1.56s
282:	learn: 0.0031797	total: 29.5s	remaining: 1.46s
283:	learn: 0.0031797	total: 29.6s	remaining: 1.35s
284:	learn: 0.0031797	total: 29.6s	remaining: 1.25s
285:	learn: 0.0031797	total: 29.7s	remaining: 1.14s
286:	learn: 0.0031797	total: 29.8s	remaining: 1.04s
287:	learn: 0.0031797	total: 29.9s	remaining: 934ms
288:	learn: 0.0031797	total: 29.9s	remaining: 829ms
289:	learn: 0.0031797	total: 30s	remaining: 724ms
290:	learn: 0.0031797	total: 30.1s	remaining: 620ms
291:	learn: 0.0031797	total: 30.2s	remaining: 517ms
292:	learn: 0.0031797	total: 30.2s	remaining: 413ms
293:	learn: 0.0031797	total: 30.3s	remaining: 309ms
294:	learn: 0.0031796	total: 30.4s	remaining: 206ms
295:	learn: 0.0031796	total: 30.5s	remaining: 103ms
296:	learn: 0.0031796	total: 30.5s	remaining: 0us
[I 2024-12-19 14:52:11,563] Trial 35 finished with value: 79.16809783426771 and parameters: {'learning_rate': 0.098965298814483, 'max_depth': 6, 'n_estimators': 297, 'scale_pos_weight': 5.1636674707054615}. Best is trial 33 with value: 80.40946207404374.
✅ Results for CatBoost (Optuna with SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.76
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.88
 - F1-Score_Train: 99.88
 - Precision_Test: 31.49
 - Recall_Test: 85.71
 - AUPRC_Test: 78.40
 - Accuracy_Test: 99.66
 - F1-Score_Test: 46.06
 - max_depth: 6
 - n_estimators: 297
 - learning_rate: 0.10
 - scale_pos_weight: 5.16
 [... all remaining CatBoost parameters: None (library defaults) ...]
✅ Updated DataFrame size: (3, 133)

🏆 Mean cross-validation AUPRC: 79.1681
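The cross-validated AUPRC reported above is the mean of the per-fold average-precision scores. A minimal sketch of that computation, with `LogisticRegression` and synthetic data standing in for the tuned CatBoost model and the real folds:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# Imbalanced synthetic data standing in for the fraud dataset (an assumption).
X, y = make_classification(n_samples=3000, weights=[0.97], random_state=1)

scores = []
for tr, te in StratifiedKFold(n_splits=5, shuffle=True, random_state=1).split(X, y):
    # One model per fold; score on the held-out part only
    model = LogisticRegression(max_iter=1000).fit(X[tr], y[tr])
    proba = model.predict_proba(X[te])[:, 1]
    scores.append(average_precision_score(y[te], proba) * 100)

mean_auprc = np.mean(scores)
print(f"Mean cross-validated AUPRC: {mean_auprc:.4f}")
```

`average_precision_score` summarizes the precision-recall curve, which is why it is the reference metric throughout this comparison on such an imbalanced target.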

🔍 Optimizing hyperparameters for CatBoost with Optuna...

🔄 Fold 1: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4874039	total: 75.1ms	remaining: 22.3s
[... per-iteration CatBoost log truncated: training loss decreases steadily, then plateaus near 0.0034 from roughly iteration 220 onward ...]
297:	learn: 0.0033439	total: 32.2s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.72
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.86
 - F1-Score_Train: 99.86
 - Precision_Test: 26.85
 - Recall_Test: 83.33
 - AUPRC_Test: 78.45
 - Accuracy_Test: 99.59
 - F1-Score_Test: 40.62
 - max_depth: 5
 - n_estimators: 298
 - learning_rate: 0.10
 - scale_pos_weight: 5.19
 - (all remaining CatBoost hyperparameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)
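The 0.998 / 0.002 → 0.50 / 0.50 shift printed above is what SMOTE does inside each fold: it synthesizes new minority (fraud) samples by interpolating between real minority points and their nearest minority neighbours. The notebook uses the library implementation (`imblearn.over_sampling.SMOTE`); the helper below, `smote_like_oversample`, is a hypothetical, simplified NumPy sketch of the idea, not the library code:

```python
import numpy as np

def smote_like_oversample(X, y, minority_label=1, k=5, random_state=0):
    """Simplified SMOTE sketch: create synthetic minority samples by
    interpolating between a minority point and one of its k nearest
    minority neighbours until both classes are the same size."""
    rng = np.random.default_rng(random_state)
    X_min = X[y == minority_label]
    n_needed = int((y != minority_label).sum() - len(X_min))
    # Pairwise distances within the minority class only
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)              # a point is not its own neighbour
    neighbours = np.argsort(d, axis=1)[:, :k]
    synthetic = []
    for _ in range(n_needed):
        i = rng.integers(len(X_min))         # pick a random minority sample
        j = rng.choice(neighbours[i])        # and one of its k neighbours
        gap = rng.random()                   # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    X_res = np.vstack([X, np.asarray(synthetic)])
    y_res = np.concatenate([y, np.full(n_needed, minority_label)])
    return X_res, y_res

# Toy imbalanced data: 190 negatives, 10 positives (~5% minority)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = np.zeros(200, dtype=int)
y[:10] = 1
X_res, y_res = smote_like_oversample(X, y)
```

Applying SMOTE only to the training portion of each fold, as the log shows, keeps synthetic points out of the test split and avoids leakage.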

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4910591	total: 84.3ms	remaining: 25s
1:	learn: 0.3617510	total: 160ms	remaining: 23.7s
2:	learn: 0.2733040	total: 239ms	remaining: 23.5s
[... iterations 3-296 omitted: training loss decreases steadily and plateaus at ~0.0043 from iteration ~197 onward ...]
297:	learn: 0.0043495	total: 29.1s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.64
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.82
 - F1-Score_Train: 99.82
 - Precision_Test: 29.17
 - Recall_Test: 88.89
 - AUPRC_Test: 77.80
 - Accuracy_Test: 99.62
 - F1-Score_Test: 43.92
 - max_depth: 5
 - n_estimators: 298
 - learning_rate: 0.10
 - scale_pos_weight: 5.19
 - (all remaining CatBoost hyperparameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)
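The per-fold test metrics reported above (Precision, Recall, F1, AUPRC) can be computed with `sklearn.metrics`; in particular, AUPRC corresponds to `average_precision_score` evaluated on the predicted probabilities rather than on hard labels. A minimal sketch on toy data (the arrays below are illustrative, not taken from the notebook):

```python
import numpy as np
from sklearn.metrics import (average_precision_score, f1_score,
                             precision_score, recall_score)

# Toy ground truth and predicted fraud probabilities (not the notebook's data)
y_true = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 0])
y_prob = np.array([0.1, 0.2, 0.05, 0.3, 0.4, 0.6, 0.9, 0.8, 0.35, 0.7])
y_pred = (y_prob >= 0.5).astype(int)       # default 0.5 decision threshold

precision = precision_score(y_true, y_pred)          # TP / (TP + FP)
recall    = recall_score(y_true, y_pred)             # TP / (TP + FN)
f1        = f1_score(y_true, y_pred)                 # harmonic mean of the two
auprc     = average_precision_score(y_true, y_prob)  # threshold-free, uses scores
```

Because fraud is only ~0.17 % of transactions (the `Class` proportions printed before SMOTE), AUPRC and recall are far more informative than accuracy, which is why the notebook tracks them per technique.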

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4828445	total: 75.7ms	remaining: 22.5s
1:	learn: 0.3480329	total: 150ms	remaining: 22.1s
2:	learn: 0.2444014	total: 231ms	remaining: 22.7s
[... iterations 3-236 omitted: training loss decreases steadily to ~0.0036 ...]
237:	learn: 0.0035746	total: 25.1s	remaining: 6.33s
238:	learn: 0.0035746	total: 25.2s	remaining: 6.22s
239:	learn: 0.0035746	total: 25.3s	remaining: 6.1s
240:	learn: 0.0035746	total: 25.3s	remaining: 5.99s
241:	learn: 0.0035746	total: 25.4s	remaining: 5.88s
242:	learn: 0.0035746	total: 25.5s	remaining: 5.77s
243:	learn: 0.0035746	total: 25.5s	remaining: 5.65s
244:	learn: 0.0035746	total: 25.6s	remaining: 5.54s
245:	learn: 0.0035746	total: 25.7s	remaining: 5.43s
246:	learn: 0.0035746	total: 25.7s	remaining: 5.31s
247:	learn: 0.0035746	total: 25.8s	remaining: 5.2s
248:	learn: 0.0035746	total: 25.9s	remaining: 5.09s
249:	learn: 0.0035746	total: 26s	remaining: 4.99s
250:	learn: 0.0035745	total: 26s	remaining: 4.87s
251:	learn: 0.0035746	total: 26.1s	remaining: 4.76s
252:	learn: 0.0035746	total: 26.1s	remaining: 4.65s
253:	learn: 0.0035745	total: 26.2s	remaining: 4.54s
254:	learn: 0.0035745	total: 26.3s	remaining: 4.43s
255:	learn: 0.0035746	total: 26.3s	remaining: 4.32s
256:	learn: 0.0035746	total: 26.4s	remaining: 4.21s
257:	learn: 0.0035746	total: 26.5s	remaining: 4.11s
258:	learn: 0.0035745	total: 26.6s	remaining: 4s
259:	learn: 0.0035745	total: 26.6s	remaining: 3.89s
260:	learn: 0.0035745	total: 26.7s	remaining: 3.78s
261:	learn: 0.0035745	total: 26.7s	remaining: 3.67s
262:	learn: 0.0035745	total: 26.8s	remaining: 3.57s
263:	learn: 0.0035745	total: 26.9s	remaining: 3.46s
264:	learn: 0.0035745	total: 26.9s	remaining: 3.35s
265:	learn: 0.0035745	total: 27s	remaining: 3.25s
266:	learn: 0.0035745	total: 27.1s	remaining: 3.15s
267:	learn: 0.0035745	total: 27.2s	remaining: 3.04s
268:	learn: 0.0035745	total: 27.2s	remaining: 2.93s
269:	learn: 0.0035745	total: 27.3s	remaining: 2.83s
270:	learn: 0.0035745	total: 27.4s	remaining: 2.73s
271:	learn: 0.0035745	total: 27.4s	remaining: 2.62s
272:	learn: 0.0035745	total: 27.5s	remaining: 2.52s
273:	learn: 0.0035745	total: 27.6s	remaining: 2.41s
274:	learn: 0.0035745	total: 27.6s	remaining: 2.31s
275:	learn: 0.0035745	total: 27.7s	remaining: 2.21s
276:	learn: 0.0035745	total: 27.7s	remaining: 2.1s
277:	learn: 0.0035745	total: 27.8s	remaining: 2s
278:	learn: 0.0035745	total: 27.9s	remaining: 1.9s
279:	learn: 0.0035745	total: 27.9s	remaining: 1.8s
280:	learn: 0.0035745	total: 28s	remaining: 1.69s
281:	learn: 0.0035745	total: 28.1s	remaining: 1.59s
282:	learn: 0.0035745	total: 28.2s	remaining: 1.49s
283:	learn: 0.0035745	total: 28.2s	remaining: 1.39s
284:	learn: 0.0035745	total: 28.3s	remaining: 1.29s
285:	learn: 0.0035744	total: 28.4s	remaining: 1.19s
286:	learn: 0.0035744	total: 28.4s	remaining: 1.09s
287:	learn: 0.0035744	total: 28.5s	remaining: 989ms
288:	learn: 0.0035744	total: 28.5s	remaining: 889ms
289:	learn: 0.0035744	total: 28.6s	remaining: 790ms
290:	learn: 0.0035744	total: 28.7s	remaining: 690ms
291:	learn: 0.0035744	total: 28.7s	remaining: 591ms
292:	learn: 0.0035744	total: 28.8s	remaining: 492ms
293:	learn: 0.0035744	total: 28.9s	remaining: 393ms
294:	learn: 0.0035744	total: 29s	remaining: 294ms
295:	learn: 0.0035744	total: 29s	remaining: 196ms
296:	learn: 0.0035744	total: 29.1s	remaining: 97.9ms
297:	learn: 0.0035744	total: 29.2s	remaining: 0us
[I 2024-12-19 14:53:49,060] Trial 36 finished with value: 78.14508014095021 and parameters: {'learning_rate': 0.0982110654729356, 'max_depth': 5, 'n_estimators': 298, 'scale_pos_weight': 5.190090911358252}. Best is trial 33 with value: 80.40946207404374.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.73
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.86
 - F1-Score_Train: 99.86
 - Precision_Test: 29.67
 - Recall_Test: 85.71
 - AUPRC_Test: 78.19
 - Accuracy_Test: 99.63
 - F1-Score_Test: 44.08
 - learning_rate: 0.10
 - max_depth: 5
 - n_estimators: 298
 - scale_pos_weight: 5.19
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 78.1451
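The fold-wise hyperparameter search reported above (best trial AUPRC ≈ 80.4) can be sketched as a small runnable loop. This is an illustrative stand-in only: scikit-learn's `GradientBoostingClassifier` replaces CatBoost, and plain random search replaces Optuna's TPE sampler; the search space (learning rate, depth, number of estimators) mirrors the parameters tuned in the trials.

```python
# Hypothetical sketch: GradientBoostingClassifier stands in for CatBoost,
# random search stands in for Optuna; average_precision_score ≈ AUPRC.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import average_precision_score
from sklearn.model_selection import train_test_split

# Small imbalanced toy dataset (~5% positives), echoing the fraud setting.
X, y = make_classification(n_samples=600, weights=[0.95], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

rng = np.random.default_rng(0)
best_auprc, best_params = -1.0, None
for trial in range(5):  # stand-in for study.optimize(objective, n_trials=...)
    params = {
        "learning_rate": 10 ** rng.uniform(-2, -0.5),
        "max_depth": int(rng.integers(3, 7)),
        "n_estimators": int(rng.integers(50, 150)),
    }
    model = GradientBoostingClassifier(random_state=0, **params).fit(X_tr, y_tr)
    # AUPRC on the held-out split drives the selection, as in the trials above.
    auprc = average_precision_score(y_te, model.predict_proba(X_te)[:, 1])
    if auprc > best_auprc:
        best_auprc, best_params = auprc, params

print(f"Best AUPRC: {best_auprc:.4f}, params: {best_params}")
```

In the actual runs, the same selection is performed per cross-validation fold, with SMOTE applied to the training portion only, so the validation AUPRC reflects the original class imbalance.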

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
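The before/after proportions printed above come from SMOTE rebalancing the training fold from ~0.17% positives to an exact 50/50 split. As a rough self-contained sketch (plain NumPy, not the imbalanced-learn implementation used in the notebook), minority samples can be synthesized by interpolating between a minority point and one of its k nearest minority neighbours:

```python
import numpy as np

def smote_like_oversample(X, y, minority=1, k=5, rng=None):
    """Minimal SMOTE-style sketch: synthesize minority samples by linear
    interpolation between a minority point and one of its k nearest
    minority neighbours, until the classes are balanced 50/50."""
    rng = np.random.default_rng(rng)
    X_min = X[y == minority]
    n_new = (y != minority).sum() - len(X_min)  # samples needed for balance
    synth = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # k nearest minority neighbours of point i (brute-force distances)
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nn = np.argsort(d)[1:k + 1]
        j = rng.choice(nn)
        lam = rng.random()  # interpolation factor in [0, 1)
        synth.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    X_bal = np.vstack([X, np.array(synth)])
    y_bal = np.concatenate([y, np.full(n_new, minority)])
    return X_bal, y_bal

# Tiny demo: 490 majority vs 10 minority samples (2% positives)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = np.zeros(500, dtype=int)
y[:10] = 1
Xb, yb = smote_like_oversample(X, y, rng=1)
print(np.bincount(yb) / len(yb))  # → [0.5 0.5]
```

Applying this only to the training fold (never to the test split) is what keeps the test metrics honest with respect to the original imbalance.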

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Training log condensed: 296 iterations, learn loss 0.4658 → 0.0031, total ≈ 31 s]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.74
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.87
 - F1-Score_Train: 99.87
 - Precision_Test: 30.51
 - Recall_Test: 85.71
 - AUPRC_Test: 80.41
 - Accuracy_Test: 99.65
 - F1-Score_Test: 45.00
 - learning_rate: 0.10
 - max_depth: 6
 - n_estimators: 296
 - scale_pos_weight: 5.02
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Training log condensed: iterations 0–122, learn loss 0.4826 → 0.0051, ≈ 14 s elapsed]
123:	learn: 0.0049807	total: 14.1s	remaining: 19.5s
124:	learn: 0.0049583	total: 14.2s	remaining: 19.4s
125:	learn: 0.0049116	total: 14.3s	remaining: 19.3s
126:	learn: 0.0048244	total: 14.5s	remaining: 19.3s
127:	learn: 0.0047557	total: 14.7s	remaining: 19.2s
128:	learn: 0.0046834	total: 14.8s	remaining: 19.2s
129:	learn: 0.0046423	total: 15s	remaining: 19.1s
130:	learn: 0.0045766	total: 15.1s	remaining: 19.1s
131:	learn: 0.0045374	total: 15.3s	remaining: 19s
132:	learn: 0.0044712	total: 15.5s	remaining: 19s
133:	learn: 0.0043894	total: 15.7s	remaining: 19s
134:	learn: 0.0043710	total: 15.9s	remaining: 18.9s
135:	learn: 0.0043015	total: 16.1s	remaining: 18.9s
136:	learn: 0.0042583	total: 16.2s	remaining: 18.8s
137:	learn: 0.0042345	total: 16.4s	remaining: 18.8s
138:	learn: 0.0041603	total: 16.6s	remaining: 18.7s
139:	learn: 0.0040924	total: 16.8s	remaining: 18.7s
140:	learn: 0.0040586	total: 16.9s	remaining: 18.6s
141:	learn: 0.0039567	total: 17.1s	remaining: 18.6s
142:	learn: 0.0039389	total: 17.3s	remaining: 18.5s
143:	learn: 0.0038858	total: 17.5s	remaining: 18.4s
144:	learn: 0.0038787	total: 17.6s	remaining: 18.3s
145:	learn: 0.0038481	total: 17.8s	remaining: 18.3s
146:	learn: 0.0038413	total: 17.9s	remaining: 18.2s
147:	learn: 0.0038413	total: 18.1s	remaining: 18.1s
148:	learn: 0.0038139	total: 18.2s	remaining: 18s
149:	learn: 0.0037960	total: 18.4s	remaining: 17.9s
150:	learn: 0.0037492	total: 18.6s	remaining: 17.8s
151:	learn: 0.0036960	total: 18.7s	remaining: 17.7s
152:	learn: 0.0036961	total: 18.9s	remaining: 17.6s
153:	learn: 0.0036960	total: 19s	remaining: 17.5s
154:	learn: 0.0036766	total: 19.2s	remaining: 17.5s
155:	learn: 0.0036594	total: 19.4s	remaining: 17.4s
156:	learn: 0.0036362	total: 19.5s	remaining: 17.3s
157:	learn: 0.0036118	total: 19.7s	remaining: 17.2s
158:	learn: 0.0035944	total: 19.8s	remaining: 17.1s
159:	learn: 0.0035444	total: 20s	remaining: 17s
160:	learn: 0.0034842	total: 20.2s	remaining: 16.9s
161:	learn: 0.0034841	total: 20.3s	remaining: 16.8s
162:	learn: 0.0034549	total: 20.4s	remaining: 16.7s
163:	learn: 0.0034548	total: 20.5s	remaining: 16.5s
164:	learn: 0.0034548	total: 20.6s	remaining: 16.4s
165:	learn: 0.0034548	total: 20.7s	remaining: 16.2s
166:	learn: 0.0034300	total: 20.8s	remaining: 16s
167:	learn: 0.0034162	total: 20.9s	remaining: 15.9s
168:	learn: 0.0033716	total: 21s	remaining: 15.7s
169:	learn: 0.0033299	total: 21s	remaining: 15.6s
170:	learn: 0.0033080	total: 21.2s	remaining: 15.5s
171:	learn: 0.0032697	total: 21.3s	remaining: 15.3s
172:	learn: 0.0032130	total: 21.4s	remaining: 15.2s
173:	learn: 0.0032130	total: 21.4s	remaining: 15s
174:	learn: 0.0031711	total: 21.5s	remaining: 14.9s
175:	learn: 0.0031711	total: 21.6s	remaining: 14.7s
176:	learn: 0.0031711	total: 21.7s	remaining: 14.6s
177:	learn: 0.0031711	total: 21.8s	remaining: 14.4s
178:	learn: 0.0031711	total: 21.8s	remaining: 14.3s
179:	learn: 0.0031711	total: 21.9s	remaining: 14.1s
180:	learn: 0.0031711	total: 22s	remaining: 14s
181:	learn: 0.0031301	total: 22.1s	remaining: 13.8s
182:	learn: 0.0030922	total: 22.2s	remaining: 13.7s
183:	learn: 0.0030836	total: 22.3s	remaining: 13.6s
184:	learn: 0.0030659	total: 22.4s	remaining: 13.4s
185:	learn: 0.0030285	total: 22.5s	remaining: 13.3s
186:	learn: 0.0030284	total: 22.6s	remaining: 13.1s
187:	learn: 0.0029863	total: 22.7s	remaining: 13s
188:	learn: 0.0029862	total: 22.7s	remaining: 12.9s
189:	learn: 0.0029863	total: 22.8s	remaining: 12.7s
190:	learn: 0.0029863	total: 22.9s	remaining: 12.6s
191:	learn: 0.0029514	total: 23s	remaining: 12.5s
192:	learn: 0.0029514	total: 23.1s	remaining: 12.3s
193:	learn: 0.0029514	total: 23.1s	remaining: 12.2s
194:	learn: 0.0029514	total: 23.2s	remaining: 12s
195:	learn: 0.0029513	total: 23.3s	remaining: 11.9s
196:	learn: 0.0029513	total: 23.4s	remaining: 11.8s
197:	learn: 0.0029513	total: 23.5s	remaining: 11.6s
198:	learn: 0.0029513	total: 23.6s	remaining: 11.5s
199:	learn: 0.0029513	total: 23.6s	remaining: 11.4s
200:	learn: 0.0029513	total: 23.7s	remaining: 11.2s
201:	learn: 0.0029512	total: 23.8s	remaining: 11.1s
202:	learn: 0.0029512	total: 23.9s	remaining: 11s
203:	learn: 0.0029512	total: 24s	remaining: 10.8s
204:	learn: 0.0029511	total: 24.1s	remaining: 10.7s
205:	learn: 0.0029511	total: 24.2s	remaining: 10.6s
206:	learn: 0.0029511	total: 24.3s	remaining: 10.4s
207:	learn: 0.0029511	total: 24.3s	remaining: 10.3s
208:	learn: 0.0029511	total: 24.4s	remaining: 10.2s
209:	learn: 0.0029511	total: 24.5s	remaining: 10s
210:	learn: 0.0029511	total: 24.6s	remaining: 9.91s
211:	learn: 0.0029511	total: 24.7s	remaining: 9.78s
212:	learn: 0.0029511	total: 24.8s	remaining: 9.65s
213:	learn: 0.0029511	total: 24.8s	remaining: 9.52s
214:	learn: 0.0029511	total: 24.9s	remaining: 9.39s
215:	learn: 0.0029511	total: 25s	remaining: 9.27s
216:	learn: 0.0029511	total: 25.1s	remaining: 9.14s
217:	learn: 0.0029511	total: 25.2s	remaining: 9.01s
218:	learn: 0.0029510	total: 25.3s	remaining: 8.88s
219:	learn: 0.0029510	total: 25.3s	remaining: 8.76s
220:	learn: 0.0029510	total: 25.4s	remaining: 8.63s
221:	learn: 0.0029510	total: 25.5s	remaining: 8.51s
222:	learn: 0.0029510	total: 25.6s	remaining: 8.38s
223:	learn: 0.0029510	total: 25.7s	remaining: 8.25s
224:	learn: 0.0029510	total: 25.8s	remaining: 8.13s
225:	learn: 0.0029510	total: 25.8s	remaining: 8s
226:	learn: 0.0029510	total: 25.9s	remaining: 7.88s
227:	learn: 0.0029510	total: 26s	remaining: 7.76s
228:	learn: 0.0029510	total: 26.1s	remaining: 7.63s
229:	learn: 0.0029509	total: 26.2s	remaining: 7.51s
230:	learn: 0.0029509	total: 26.3s	remaining: 7.39s
231:	learn: 0.0029509	total: 26.3s	remaining: 7.26s
232:	learn: 0.0029509	total: 26.4s	remaining: 7.14s
233:	learn: 0.0029509	total: 26.5s	remaining: 7.02s
234:	learn: 0.0029509	total: 26.6s	remaining: 6.9s
235:	learn: 0.0029509	total: 26.7s	remaining: 6.78s
236:	learn: 0.0029508	total: 26.8s	remaining: 6.66s
237:	learn: 0.0029508	total: 26.8s	remaining: 6.54s
238:	learn: 0.0029508	total: 26.9s	remaining: 6.42s
239:	learn: 0.0029507	total: 27s	remaining: 6.3s
240:	learn: 0.0029508	total: 27.1s	remaining: 6.18s
241:	learn: 0.0029507	total: 27.2s	remaining: 6.06s
242:	learn: 0.0029506	total: 27.3s	remaining: 5.95s
243:	learn: 0.0029506	total: 27.3s	remaining: 5.83s
244:	learn: 0.0029506	total: 27.4s	remaining: 5.71s
245:	learn: 0.0029506	total: 27.5s	remaining: 5.59s
246:	learn: 0.0029506	total: 27.6s	remaining: 5.47s
247:	learn: 0.0029506	total: 27.7s	remaining: 5.35s
248:	learn: 0.0029506	total: 27.8s	remaining: 5.24s
249:	learn: 0.0029506	total: 27.8s	remaining: 5.12s
250:	learn: 0.0029505	total: 27.9s	remaining: 5s
251:	learn: 0.0029505	total: 28s	remaining: 4.89s
252:	learn: 0.0029505	total: 28.1s	remaining: 4.77s
253:	learn: 0.0029421	total: 28.2s	remaining: 4.66s
254:	learn: 0.0029114	total: 28.3s	remaining: 4.55s
255:	learn: 0.0029114	total: 28.3s	remaining: 4.43s
256:	learn: 0.0028755	total: 28.4s	remaining: 4.32s
257:	learn: 0.0028755	total: 28.5s	remaining: 4.2s
258:	learn: 0.0028755	total: 28.6s	remaining: 4.09s
259:	learn: 0.0028755	total: 28.7s	remaining: 3.97s
260:	learn: 0.0028755	total: 28.8s	remaining: 3.86s
261:	learn: 0.0028755	total: 28.9s	remaining: 3.74s
262:	learn: 0.0028755	total: 28.9s	remaining: 3.63s
263:	learn: 0.0028755	total: 29s	remaining: 3.52s
264:	learn: 0.0028755	total: 29.1s	remaining: 3.4s
265:	learn: 0.0028755	total: 29.2s	remaining: 3.29s
266:	learn: 0.0028755	total: 29.2s	remaining: 3.17s
267:	learn: 0.0028755	total: 29.3s	remaining: 3.06s
268:	learn: 0.0028755	total: 29.4s	remaining: 2.95s
269:	learn: 0.0028755	total: 29.5s	remaining: 2.84s
270:	learn: 0.0028754	total: 29.6s	remaining: 2.73s
271:	learn: 0.0028755	total: 29.6s	remaining: 2.62s
272:	learn: 0.0028754	total: 29.7s	remaining: 2.5s
273:	learn: 0.0028754	total: 29.8s	remaining: 2.39s
274:	learn: 0.0028754	total: 29.9s	remaining: 2.28s
275:	learn: 0.0028754	total: 30s	remaining: 2.17s
276:	learn: 0.0028754	total: 30.1s	remaining: 2.06s
277:	learn: 0.0028754	total: 30.1s	remaining: 1.95s
278:	learn: 0.0028754	total: 30.2s	remaining: 1.84s
279:	learn: 0.0028754	total: 30.4s	remaining: 1.73s
280:	learn: 0.0028754	total: 30.5s	remaining: 1.63s
281:	learn: 0.0028754	total: 30.6s	remaining: 1.52s
282:	learn: 0.0028754	total: 30.8s	remaining: 1.41s
283:	learn: 0.0028754	total: 30.9s	remaining: 1.3s
284:	learn: 0.0028754	total: 31s	remaining: 1.2s
285:	learn: 0.0028754	total: 31.1s	remaining: 1.09s
286:	learn: 0.0028754	total: 31.3s	remaining: 981ms
287:	learn: 0.0028754	total: 31.4s	remaining: 873ms
288:	learn: 0.0028754	total: 31.6s	remaining: 764ms
289:	learn: 0.0028754	total: 31.7s	remaining: 656ms
290:	learn: 0.0028754	total: 31.9s	remaining: 547ms
291:	learn: 0.0028754	total: 32s	remaining: 438ms
292:	learn: 0.0028754	total: 32.1s	remaining: 329ms
293:	learn: 0.0028754	total: 32.3s	remaining: 220ms
294:	learn: 0.0028754	total: 32.4s	remaining: 110ms
295:	learn: 0.0028753	total: 32.5s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.80
 - Recall_Train: 100.00
 - AUPRC_Train: 100.00
 - Accuracy_Train: 99.90
 - F1-Score_Train: 99.90
 - Precision_Test: 34.81
 - Recall_Test: 87.30
 - AUPRC_Test: 80.40
 - Accuracy_Test: 99.70
 - F1-Score_Test: 49.77
 - max_depth: 6
 - n_estimators: 296
 - learning_rate: 0.10
 - scale_pos_weight: 5.02
 - (resto de hiperparámetros sin fijar: None)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4359159	total: 84.5ms	remaining: 24.9s
... (iteraciones 1 a 294 omitidas; la pérdida de entrenamiento converge a ~0.0029) ...
295:	learn: 0.0029227	total: 29.9s	remaining: 0us
[I 2024-12-19 14:55:31,911] Trial 37 finished with value: 80.67361029056296 and parameters: {'learning_rate': 0.0998524368637925, 'max_depth': 6, 'n_estimators': 296, 'scale_pos_weight': 5.024432396376165}. Best is trial 37 with value: 80.67361029056296.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.78
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.89
 - F1-Score_Train: 99.89
 - Precision_Test: 34.38
 - Recall_Test: 86.51
 - AUPRC_Test: 81.21
 - Accuracy_Test: 99.70
 - F1-Score_Test: 49.21
 - max_depth: 6
 - n_estimators: 296
 - learning_rate: 0.10
 - scale_pos_weight: 5.02
 - (resto de hiperparámetros sin fijar: None)
✅ Tamaño del DataFrame actualizado: (3, 133)
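
The `Sobreajuste: 1` flag above contrasts train and test metrics (AUPRC_Train 99.99 vs AUPRC_Test 81.21), but the exact rule that sets it is not shown in this output. A minimal sketch, assuming (hypothetically) that the flag fires when the train-test AUPRC gap exceeds a fixed threshold; `overfit_flag` and the 10-point cutoff are illustrative names, not the notebook's actual criterion:

```python
# Hedged sketch of a train-vs-test overfitting flag like "Sobreajuste: 1".
# Assumption: a gap above 10 AUPRC percentage points marks overfitting;
# the notebook's real rule is not visible in the printed output.
def overfit_flag(auprc_train: float, auprc_test: float, gap: float = 10.0) -> int:
    """Return 1 when the train-test AUPRC gap exceeds `gap` percentage points."""
    return int(auprc_train - auprc_test > gap)

# Values from the fold above: AUPRC_Train 99.99 vs AUPRC_Test 81.21
print(overfit_flag(99.99, 81.21))  # gap of ~18.8 points -> 1
```

With this reading, the near-perfect train metrics against a 34% test precision are exactly the SMOTE-induced optimism the flag is meant to surface.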

🏆 Promedio de AUPRC en validación cruzada: 80.6736
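
The cross-validated AUPRC reported above (80.6736) is the mean of per-fold average-precision scores. A minimal sketch of that computation with scikit-learn, on synthetic imbalanced data standing in for the credit-card features (the classifier and fold count here are placeholders, not the notebook's tuned CatBoost):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# Synthetic ~2% positive class, loosely mimicking the fraud imbalance.
X, y = make_classification(n_samples=2000, weights=[0.98], random_state=42)

scores = []
for train_idx, test_idx in StratifiedKFold(n_splits=3, shuffle=True,
                                           random_state=42).split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    proba = model.predict_proba(X[test_idx])[:, 1]          # fraud probabilities
    scores.append(average_precision_score(y[test_idx], proba))  # per-fold AUPRC

print(f"Promedio de AUPRC en validación cruzada: {np.mean(scores) * 100:.4f}")
```

`average_precision_score` is the AUPRC estimate used throughout the comparison; averaging it over stratified folds is what makes the 80.67 figure comparable across techniques.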

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
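
The "Antes de SMOTE / Después de SMOTE" readouts above are class proportions from `value_counts(normalize=True)`, taken inside each fold so that resampling never touches the test split. A minimal sketch; the notebook uses imbalanced-learn's `SMOTE`, and the naive oversampling fallback below is only an assumption for environments without that library:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
y = pd.Series([0] * 1990 + [1] * 10, name="Class")   # heavy imbalance, as in the dataset
X = pd.DataFrame({"V1": rng.normal(size=len(y))})

print("📊 Antes de SMOTE:", y.value_counts(normalize=True), sep="\n")

try:
    from imblearn.over_sampling import SMOTE          # what the notebook uses
    X_res, y_res = SMOTE(random_state=42).fit_resample(X, y)
except ImportError:
    # Fallback sketch (NOT SMOTE): duplicate minority rows until balanced.
    minority = y[y == 1].index
    extra = rng.choice(minority, size=(y == 0).sum() - (y == 1).sum())
    X_res = pd.concat([X, X.loc[extra]], ignore_index=True)
    y_res = pd.concat([y, y.loc[extra]], ignore_index=True)

print("📈 Después de SMOTE:", y_res.value_counts(normalize=True), sep="\n")
```

Either path ends at the 0.50 / 0.50 split shown above; the key design choice is that only the training partition of the fold is resampled.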

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4808432	total: 86.7ms	remaining: 25.9s
[... iterations 1-298 omitted; the training loss plateaus at 0.0030717 from iteration 157 onward ...]
299:	learn: 0.0030717	total: 31.1s	remaining: 0us
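
Note that the `learn` loss in this log goes flat well before the 296 configured estimators are exhausted, while `early_stopping_rounds` is unset; enabling it would likely cut most of the ~31s per fit. A small parse of CatBoost's `iter:\tlearn: loss\t...` log format shows how to spot the plateau (the embedded lines are copied from the output above):

```python
import re

# Representative lines from the CatBoost training log above.
log = """\
156:\tlearn: 0.0030938\ttotal: 17.2s\tremaining: 15.6s
157:\tlearn: 0.0030717\ttotal: 17.2s\ttremaining: 15.5s
299:\tlearn: 0.0030717\ttotal: 31.1s\tremaining: 0us
"""

# CatBoost prints "<iter>:\tlearn: <loss>\ttotal: ...\tremaining: ...".
losses = {int(m["it"]): float(m["loss"])
          for m in re.finditer(r"(?P<it>\d+):\tlearn: (?P<loss>[\d.]+)", log)}

final_loss = losses[max(losses)]
plateau_start = min(it for it, loss in losses.items() if loss == final_loss)
print(f"learn loss flat at {final_loss} from iteration {plateau_start}")
```

Here the plateau starts at iteration 157, so roughly half the boosting rounds in this fold add no measurable training improvement.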

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Técnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.70
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.85
 - Precision_Test: 26.91
 - Recall_Test: 86.51
 - AUPRC_Test: 78.49
 - Accuracy_Test: 99.58
 - F1-Score_Test: 41.05
 - max_depth: 6
 - n_estimators: 300
 - learning_rate: 0.10
 - scale_pos_weight: 6.06
 - (all remaining CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4757795	total: 83.5ms	remaining: 25s
[... iterations 1-259 omitted; the training loss descends steadily and stalls near 0.0028 from iteration ~248 ...]
260:	learn: 0.0027752	total: 28.8s	remaining: 4.31s
261:	learn: 0.0027752	total: 28.9s	remaining: 4.19s
262:	learn: 0.0027752	total: 29s	remaining: 4.08s
263:	learn: 0.0027752	total: 29.1s	remaining: 3.96s
264:	learn: 0.0027752	total: 29.1s	remaining: 3.85s
265:	learn: 0.0027752	total: 29.2s	remaining: 3.73s
266:	learn: 0.0027752	total: 29.3s	remaining: 3.62s
267:	learn: 0.0027752	total: 29.4s	remaining: 3.51s
268:	learn: 0.0027752	total: 29.4s	remaining: 3.39s
269:	learn: 0.0027752	total: 29.5s	remaining: 3.28s
270:	learn: 0.0027752	total: 29.6s	remaining: 3.17s
271:	learn: 0.0027751	total: 29.7s	remaining: 3.06s
272:	learn: 0.0027751	total: 29.8s	remaining: 2.94s
273:	learn: 0.0027751	total: 29.9s	remaining: 2.83s
274:	learn: 0.0027751	total: 29.9s	remaining: 2.72s
275:	learn: 0.0027751	total: 30s	remaining: 2.61s
276:	learn: 0.0027592	total: 30.1s	remaining: 2.5s
277:	learn: 0.0027283	total: 30.2s	remaining: 2.39s
278:	learn: 0.0026910	total: 30.3s	remaining: 2.28s
279:	learn: 0.0026614	total: 30.4s	remaining: 2.17s
280:	learn: 0.0026615	total: 30.5s	remaining: 2.06s
281:	learn: 0.0026614	total: 30.6s	remaining: 1.95s
282:	learn: 0.0026614	total: 30.7s	remaining: 1.84s
283:	learn: 0.0026613	total: 30.8s	remaining: 1.73s
284:	learn: 0.0026614	total: 30.9s	remaining: 1.62s
285:	learn: 0.0026613	total: 30.9s	remaining: 1.51s
286:	learn: 0.0026612	total: 31s	remaining: 1.41s
287:	learn: 0.0026611	total: 31.1s	remaining: 1.3s
288:	learn: 0.0026610	total: 31.2s	remaining: 1.19s
289:	learn: 0.0026609	total: 31.3s	remaining: 1.08s
290:	learn: 0.0026608	total: 31.4s	remaining: 970ms
291:	learn: 0.0026607	total: 31.5s	remaining: 862ms
292:	learn: 0.0026607	total: 31.6s	remaining: 754ms
293:	learn: 0.0026607	total: 31.6s	remaining: 646ms
294:	learn: 0.0026607	total: 31.7s	remaining: 538ms
295:	learn: 0.0026607	total: 31.8s	remaining: 430ms
296:	learn: 0.0026606	total: 31.9s	remaining: 322ms
297:	learn: 0.0026606	total: 32s	remaining: 215ms
298:	learn: 0.0026605	total: 32.1s	remaining: 107ms
299:	learn: 0.0026606	total: 32.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.77
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.89
 - F1-Score_Train: 99.89
 - Precision_Test: 34.89
 - Recall_Test: 88.89
 - AUPRC_Test: 78.76
 - Accuracy_Test: 99.70
 - F1-Score_Test: 50.11
 - max_depth: 6
 - n_estimators: 300
 - learning_rate: 0.10
 - scale_pos_weight: 6.06
 - (el resto de hiperparámetros de CatBoost quedó en None, es decir, en sus valores por defecto)
✅ Tamaño del DataFrame actualizado: (2, 133)
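El mensaje "Tamaño del DataFrame actualizado" indica que, tras cada fold, las métricas y los hiperparámetros del modelo se añaden como una fila más a un DataFrame acumulador de resultados. Un esquema mínimo de esa operación con pandas (los nombres `resultados` y `fila` son ilustrativos; el cuaderno real maneja 133 columnas):

```python
import pandas as pd

# DataFrame acumulador de resultados (columnas ilustrativas; el cuaderno
# real combina métricas de train/test e hiperparámetros de CatBoost)
resultados = pd.DataFrame(columns=["Modelo", "Tecnica", "AUPRC_Test"])

# Fila con los resultados de un fold
fila = {"Modelo": "CatBoost", "Tecnica": "Optuna con SMOTE", "AUPRC_Test": 78.76}

# pd.concat añade la fila conservando las columnas existentes
resultados = pd.concat([resultados, pd.DataFrame([fila])], ignore_index=True)

print(f"✅ Tamaño del DataFrame actualizado: {resultados.shape}")
```

De ahí que la primera dimensión del tamaño impreso crezca en uno con cada fold evaluado.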

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4843108	total: 84.6ms	remaining: 25.3s
[... registro de entrenamiento abreviado: la pérdida desciende hasta ~0.0031 hacia la iteración 160 y apenas varía después ...]
299:	learn: 0.0030482	total: 33.8s	remaining: 0us
[I 2024-12-19 14:57:16,174] Trial 38 finished with value: 78.87409870357558 and parameters: {'learning_rate': 0.09745835195680903, 'max_depth': 6, 'n_estimators': 300, 'scale_pos_weight': 6.057068837515901}. Best is trial 37 with value: 80.67361029056296.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.71
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.86
 - F1-Score_Train: 99.86
 - Precision_Test: 29.03
 - Recall_Test: 85.71
 - AUPRC_Test: 79.37
 - Accuracy_Test: 99.62
 - F1-Score_Test: 43.37
 - max_depth: 6
 - n_estimators: 300
 - learning_rate: 0.10
 - scale_pos_weight: 6.06
 - (el resto de hiperparámetros de CatBoost quedó en None, es decir, en sus valores por defecto)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 78.8741
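El promedio de AUPRC reportado (78.8741) es la media del AUPRC obtenido en cada fold de la validación cruzada. Un esquema mínimo de ese cálculo con scikit-learn (modelo y dataset ilustrativos, con `LogisticRegression` en lugar de CatBoost para agilizar el ejemplo):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

X, y = make_classification(n_samples=3000, weights=[0.95], random_state=1)

auprcs = []
# StratifiedKFold mantiene la proporción de clases en cada fold,
# esencial con datos tan desbalanceados
skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=1)
for tr_idx, te_idx in skf.split(X, y):
    clf = LogisticRegression(max_iter=1000).fit(X[tr_idx], y[tr_idx])
    proba = clf.predict_proba(X[te_idx])[:, 1]
    # AUPRC en porcentaje, como en el resto del cuaderno
    auprcs.append(100 * average_precision_score(y[te_idx], proba))

print(f"🏆 Promedio de AUPRC en validación cruzada: {np.mean(auprcs):.4f}")
```

Promediar sobre folds da una estimación más estable del rendimiento que una única partición train/test.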

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.4993088	total: 86.3ms	remaining: 24.7s
[... registro de entrenamiento abreviado: la pérdida desciende de forma continua hasta ~0.0035 en la iteración 137 ...]
138:	learn: 0.0035299	total: 16s	remaining: 17s
139:	learn: 0.0035008	total: 16.1s	remaining: 16.9s
140:	learn: 0.0035008	total: 16.1s	remaining: 16.7s
141:	learn: 0.0035008	total: 16.2s	remaining: 16.6s
142:	learn: 0.0035008	total: 16.3s	remaining: 16.4s
143:	learn: 0.0035008	total: 16.4s	remaining: 16.3s
144:	learn: 0.0035008	total: 16.5s	remaining: 16.1s
145:	learn: 0.0034780	total: 16.6s	remaining: 16s
146:	learn: 0.0034780	total: 16.7s	remaining: 15.9s
147:	learn: 0.0034780	total: 16.7s	remaining: 15.7s
148:	learn: 0.0034356	total: 16.8s	remaining: 15.6s
149:	learn: 0.0034094	total: 16.9s	remaining: 15.5s
150:	learn: 0.0033521	total: 17s	remaining: 15.3s
151:	learn: 0.0033521	total: 17.1s	remaining: 15.2s
152:	learn: 0.0033033	total: 17.2s	remaining: 15.1s
153:	learn: 0.0033033	total: 17.3s	remaining: 14.9s
154:	learn: 0.0033033	total: 17.4s	remaining: 14.8s
155:	learn: 0.0032815	total: 17.5s	remaining: 14.7s
156:	learn: 0.0032369	total: 17.6s	remaining: 14.5s
157:	learn: 0.0032369	total: 17.6s	remaining: 14.4s
158:	learn: 0.0032369	total: 17.7s	remaining: 14.3s
159:	learn: 0.0032369	total: 17.8s	remaining: 14.1s
160:	learn: 0.0032218	total: 17.9s	remaining: 14s
161:	learn: 0.0032218	total: 18s	remaining: 13.9s
162:	learn: 0.0032218	total: 18s	remaining: 13.7s
163:	learn: 0.0032154	total: 18.1s	remaining: 13.6s
164:	learn: 0.0031858	total: 18.2s	remaining: 13.5s
165:	learn: 0.0031858	total: 18.3s	remaining: 13.3s
166:	learn: 0.0031858	total: 18.4s	remaining: 13.2s
167:	learn: 0.0031096	total: 18.5s	remaining: 13.1s
168:	learn: 0.0031096	total: 18.5s	remaining: 12.9s
169:	learn: 0.0030916	total: 18.6s	remaining: 12.8s
170:	learn: 0.0030514	total: 18.8s	remaining: 12.7s
171:	learn: 0.0030218	total: 18.9s	remaining: 12.7s
172:	learn: 0.0029752	total: 19.1s	remaining: 12.6s
173:	learn: 0.0029752	total: 19.2s	remaining: 12.5s
174:	learn: 0.0029751	total: 19.4s	remaining: 12.4s
175:	learn: 0.0029751	total: 19.5s	remaining: 12.3s
176:	learn: 0.0029751	total: 19.6s	remaining: 12.2s
177:	learn: 0.0029615	total: 19.8s	remaining: 12.1s
178:	learn: 0.0029615	total: 19.9s	remaining: 12s
179:	learn: 0.0029567	total: 20.1s	remaining: 12s
180:	learn: 0.0029567	total: 20.3s	remaining: 11.9s
181:	learn: 0.0029567	total: 20.4s	remaining: 11.8s
182:	learn: 0.0029567	total: 20.5s	remaining: 11.7s
183:	learn: 0.0029567	total: 20.7s	remaining: 11.6s
184:	learn: 0.0029410	total: 20.8s	remaining: 11.5s
185:	learn: 0.0029094	total: 21s	remaining: 11.4s
186:	learn: 0.0028825	total: 21.2s	remaining: 11.3s
187:	learn: 0.0028825	total: 21.3s	remaining: 11.2s
188:	learn: 0.0028825	total: 21.5s	remaining: 11.1s
189:	learn: 0.0028825	total: 21.6s	remaining: 11s
190:	learn: 0.0028825	total: 21.7s	remaining: 10.9s
191:	learn: 0.0028825	total: 21.9s	remaining: 10.8s
192:	learn: 0.0028825	total: 22s	remaining: 10.7s
193:	learn: 0.0028825	total: 22.2s	remaining: 10.6s
194:	learn: 0.0028825	total: 22.3s	remaining: 10.5s
195:	learn: 0.0028825	total: 22.5s	remaining: 10.4s
196:	learn: 0.0028825	total: 22.6s	remaining: 10.3s
197:	learn: 0.0028223	total: 22.8s	remaining: 10.2s
198:	learn: 0.0028126	total: 22.9s	remaining: 10.1s
199:	learn: 0.0028126	total: 23s	remaining: 10s
200:	learn: 0.0028126	total: 23.2s	remaining: 9.92s
201:	learn: 0.0028126	total: 23.3s	remaining: 9.81s
202:	learn: 0.0028126	total: 23.5s	remaining: 9.71s
203:	learn: 0.0028126	total: 23.6s	remaining: 9.61s
204:	learn: 0.0028126	total: 23.8s	remaining: 9.5s
205:	learn: 0.0028126	total: 23.9s	remaining: 9.4s
206:	learn: 0.0028126	total: 24.1s	remaining: 9.29s
207:	learn: 0.0028126	total: 24.2s	remaining: 9.2s
208:	learn: 0.0028126	total: 24.4s	remaining: 9.09s
209:	learn: 0.0028126	total: 24.5s	remaining: 8.99s
210:	learn: 0.0028126	total: 24.6s	remaining: 8.88s
211:	learn: 0.0028126	total: 24.7s	remaining: 8.75s
212:	learn: 0.0028044	total: 24.8s	remaining: 8.63s
213:	learn: 0.0028044	total: 24.9s	remaining: 8.49s
214:	learn: 0.0028044	total: 25s	remaining: 8.37s
215:	learn: 0.0028044	total: 25.1s	remaining: 8.24s
216:	learn: 0.0028044	total: 25.2s	remaining: 8.11s
217:	learn: 0.0028044	total: 25.3s	remaining: 8s
218:	learn: 0.0028044	total: 25.4s	remaining: 7.87s
219:	learn: 0.0027945	total: 25.4s	remaining: 7.75s
220:	learn: 0.0027477	total: 25.6s	remaining: 7.63s
221:	learn: 0.0027477	total: 25.6s	remaining: 7.5s
222:	learn: 0.0027142	total: 25.7s	remaining: 7.38s
223:	learn: 0.0027142	total: 25.8s	remaining: 7.26s
224:	learn: 0.0027142	total: 25.9s	remaining: 7.14s
225:	learn: 0.0026840	total: 26s	remaining: 7.01s
226:	learn: 0.0026840	total: 26.1s	remaining: 6.89s
227:	learn: 0.0026840	total: 26.2s	remaining: 6.77s
228:	learn: 0.0026840	total: 26.2s	remaining: 6.65s
229:	learn: 0.0026839	total: 26.4s	remaining: 6.53s
230:	learn: 0.0026839	total: 26.4s	remaining: 6.41s
231:	learn: 0.0026839	total: 26.5s	remaining: 6.29s
232:	learn: 0.0026839	total: 26.6s	remaining: 6.17s
233:	learn: 0.0026839	total: 26.7s	remaining: 6.04s
234:	learn: 0.0026839	total: 26.8s	remaining: 5.92s
235:	learn: 0.0026839	total: 26.8s	remaining: 5.8s
236:	learn: 0.0026839	total: 26.9s	remaining: 5.68s
237:	learn: 0.0026839	total: 27s	remaining: 5.56s
238:	learn: 0.0026839	total: 27.1s	remaining: 5.44s
239:	learn: 0.0026839	total: 27.2s	remaining: 5.32s
240:	learn: 0.0026839	total: 27.2s	remaining: 5.2s
241:	learn: 0.0026839	total: 27.3s	remaining: 5.08s
242:	learn: 0.0026839	total: 27.4s	remaining: 4.96s
243:	learn: 0.0026839	total: 27.5s	remaining: 4.84s
244:	learn: 0.0026839	total: 27.6s	remaining: 4.73s
245:	learn: 0.0026839	total: 27.7s	remaining: 4.61s
246:	learn: 0.0026839	total: 27.7s	remaining: 4.49s
247:	learn: 0.0026839	total: 27.8s	remaining: 4.38s
248:	learn: 0.0026839	total: 27.9s	remaining: 4.26s
249:	learn: 0.0026839	total: 28s	remaining: 4.14s
250:	learn: 0.0026839	total: 28.1s	remaining: 4.03s
251:	learn: 0.0026839	total: 28.2s	remaining: 3.92s
252:	learn: 0.0026839	total: 28.3s	remaining: 3.8s
253:	learn: 0.0026839	total: 28.4s	remaining: 3.69s
254:	learn: 0.0026839	total: 28.5s	remaining: 3.57s
255:	learn: 0.0026839	total: 28.5s	remaining: 3.45s
256:	learn: 0.0026839	total: 28.6s	remaining: 3.34s
257:	learn: 0.0026839	total: 28.7s	remaining: 3.23s
258:	learn: 0.0026839	total: 28.8s	remaining: 3.11s
259:	learn: 0.0026839	total: 28.9s	remaining: 3s
260:	learn: 0.0026838	total: 28.9s	remaining: 2.88s
261:	learn: 0.0026838	total: 29s	remaining: 2.77s
262:	learn: 0.0026838	total: 29.1s	remaining: 2.65s
263:	learn: 0.0026838	total: 29.2s	remaining: 2.54s
264:	learn: 0.0026838	total: 29.3s	remaining: 2.43s
265:	learn: 0.0026838	total: 29.4s	remaining: 2.32s
266:	learn: 0.0026838	total: 29.4s	remaining: 2.21s
267:	learn: 0.0026838	total: 29.5s	remaining: 2.09s
268:	learn: 0.0026838	total: 29.6s	remaining: 1.98s
269:	learn: 0.0026838	total: 29.7s	remaining: 1.87s
270:	learn: 0.0026838	total: 29.8s	remaining: 1.76s
271:	learn: 0.0026838	total: 29.8s	remaining: 1.65s
272:	learn: 0.0026838	total: 29.9s	remaining: 1.53s
273:	learn: 0.0026838	total: 30s	remaining: 1.42s
274:	learn: 0.0026838	total: 30.1s	remaining: 1.31s
275:	learn: 0.0026838	total: 30.2s	remaining: 1.2s
276:	learn: 0.0026838	total: 30.2s	remaining: 1.09s
277:	learn: 0.0026838	total: 30.3s	remaining: 982ms
278:	learn: 0.0026838	total: 30.4s	remaining: 872ms
279:	learn: 0.0026838	total: 30.5s	remaining: 762ms
280:	learn: 0.0026838	total: 30.6s	remaining: 653ms
281:	learn: 0.0026838	total: 30.7s	remaining: 544ms
282:	learn: 0.0026838	total: 30.7s	remaining: 434ms
283:	learn: 0.0026838	total: 30.8s	remaining: 326ms
284:	learn: 0.0026837	total: 30.9s	remaining: 217ms
285:	learn: 0.0026838	total: 31s	remaining: 108ms
286:	learn: 0.0026838	total: 31.1s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.72
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.86
 - F1-Score_Train: 99.86
 - Precision_Test: 28.08
 - Recall_Test: 84.92
 - AUPRC_Test: 80.46
 - Accuracy_Test: 99.61
 - F1-Score_Test: 42.21
 - Hiperparámetros fijados (Optuna): max_depth: 6, n_estimators: 287, learning_rate: 0.09, scale_pos_weight: 6.22
 - Resto de parámetros de CatBoost sin fijar (None): valores por defecto de la biblioteca
✅ Tamaño del DataFrame actualizado: (1, 133)
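Las métricas derivadas del informe anterior pueden verificarse a mano: el F1-Score es la media armónica de precisión y recall, y la tasa de falsos negativos (FNR, usada más adelante en la tabla resumen) es el complemento del recall. Un esbozo mínimo en Python puro, usando las cifras reportadas para el Fold 1 (los nombres de función son ilustrativos, no parte del notebook original):

```python
def f1_desde(precision: float, recall: float) -> float:
    """Media armónica de precisión y recall (valores en %)."""
    return 2 * precision * recall / (precision + recall)

def fnr_desde(recall: float) -> float:
    """La tasa de falsos negativos es el complemento del recall (en %)."""
    return 100.0 - recall

# Cifras reportadas en el Fold 1 (CatBoost, Optuna con SMOTE):
precision_test, recall_test = 28.08, 84.92
print(f"F1-Score_Test ≈ {f1_desde(precision_test, recall_test):.2f}")  # ≈ 42.20
print(f"FNR_Test = {fnr_desde(recall_test):.2f}")                       # 15.08
```

El contraste entre Precision_Train (99.72) y Precision_Test (28.08) es lo que el informe marca con `Sobreajuste: 1`: el modelo memoriza el conjunto balanceado con SMOTE pero pierde precisión sobre la distribución real desbalanceada.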

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Registro de entrenamiento de CatBoost condensado: 287 iteraciones, pérdida de entrenamiento (learn) descendiendo de 0.4941965 a 0.0035758, tiempo total 31s]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.64
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.82
 - F1-Score_Train: 99.82
 - Precision_Test: 25.70
 - Recall_Test: 87.30
 - AUPRC_Test: 79.99
 - Accuracy_Test: 99.55
 - F1-Score_Test: 39.71
 - Hiperparámetros fijados (Optuna): max_depth: 6, n_estimators: 287, learning_rate: 0.09, scale_pos_weight: 6.22
 - Resto de parámetros de CatBoost sin fijar (None): valores por defecto de la biblioteca
✅ Tamaño del DataFrame actualizado: (2, 133)
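Las proporciones "Antes/Después de SMOTE" que imprime cada fold se obtienen con un simple conteo de clases. Un esbozo mínimo con un vector de etiquetas hipotético que reproduce el mismo desequilibrio del log (~0.168 % de fraude antes del balanceo):

```python
from collections import Counter

def proporciones(y):
    """Proporción de cada clase en el vector de etiquetas."""
    conteo = Counter(y)
    total = len(y)
    return {clase: conteo[clase] / total for clase in sorted(conteo)}

# Vector hipotético con el mismo desequilibrio que el log (clase 1 ≈ 0.00168)
y_antes = [0] * 59350 + [1] * 100
print(proporciones(y_antes))

# Tras aplicar SMOTE, ambas clases quedan al 50 %
y_despues = [0] * 59350 + [1] * 59350
print(proporciones(y_despues))
```

SMOTE se aplica solo a la partición de entrenamiento de cada fold, por lo que la proporción "Antes" (0.99831789 / 0.00168211) se repite en todos los folds mientras que el conjunto de test conserva el desequilibrio original.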

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5022611	total: 82.4ms	remaining: 23.6s
1:	learn: 0.3436314	total: 167ms	remaining: 23.8s
2:	learn: 0.2471775	total: 254ms	remaining: 24.1s
[... 287-iteration CatBoost training log truncated: the training loss falls steadily to ~0.0028 by iteration ~188 and stays flat for the remaining iterations ...]
286:	learn: 0.0027818	total: 30.9s	remaining: 0us
[I 2024-12-19 14:58:56,167] Trial 39 finished with value: 79.6702070023647 and parameters: {'learning_rate': 0.08714503523305407, 'max_depth': 6, 'n_estimators': 287, 'scale_pos_weight': 6.216876482022162}. Best is trial 37 with value: 80.67361029056296.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.74
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.87
 - F1-Score_Train: 99.87
 - Precision_Test: 30.59
 - Recall_Test: 85.71
 - AUPRC_Test: 78.56
 - Accuracy_Test: 99.65
 - F1-Score_Test: 45.09
 - max_depth: 6
 - n_estimators: 287
 - learning_rate: 0.09
 - scale_pos_weight: 6.22
 - (all other CatBoost parameters in this listing: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (3, 133)
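The `Sobreajuste: 1` flag in the results above reflects the large train/test gap (AUPRC 99.99 on train vs 78.56 on test, precision 99.74 vs 30.59). The notebook's exact rule is not shown in this output, so the criterion and threshold below are assumptions, sketched for illustration:

```python
def overfit_flag(train_score: float, test_score: float, max_gap: float = 10.0) -> int:
    """Return 1 when the train/test gap exceeds max_gap points (assumed threshold)."""
    return int(train_score - test_score > max_gap)

# Values from the fold above: AUPRC_Train = 99.99, AUPRC_Test = 78.56.
print(overfit_flag(99.99, 78.56))  # -> 1 (gap of ~21 points)
```

A gap this size is typical when SMOTE-balanced training data makes the training metrics near-perfect while the untouched, imbalanced test fold remains hard.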

🏆 Promedio de AUPRC en validación cruzada: 79.6702

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5530034	total: 186ms	remaining: 53s
1:	learn: 0.4264755	total: 370ms	remaining: 52.5s
2:	learn: 0.3374282	total: 583ms	remaining: 55s
[... 286-iteration CatBoost training log truncated: the training loss falls steadily to ~0.0031 by iteration ~235 and stays flat for the remaining iterations ...]
285:	learn: 0.0031253	total: 31.4s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.67
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.83
 - F1-Score_Train: 99.83
 - Precision_Test: 23.95
 - Recall_Test: 85.71
 - AUPRC_Test: 79.86
 - Accuracy_Test: 99.52
 - F1-Score_Test: 37.44
 - max_depth: 6
 - n_estimators: 286
 - learning_rate: 0.06
 - scale_pos_weight: 6.58
 - (all remaining CatBoost hyperparameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5615555	total: 170ms	remaining: 48.6s
... (iterations 1-284 omitted; loss decreases steadily, then plateaus around 0.0034) ...
285:	learn: 0.0033874	total: 32.8s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.66
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.83
 - F1-Score_Train: 99.83
 - Precision_Test: 27.96
 - Recall_Test: 88.10
 - AUPRC_Test: 75.93
 - Accuracy_Test: 99.60
 - F1-Score_Test: 42.45
 - max_depth: 6
 - n_estimators: 286
 - learning_rate: 0.06
 - scale_pos_weight: 6.58
 - (all remaining CatBoost hyperparameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)
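The "Tamaño del DataFrame actualizado" message grows by one row per fold, (1, 133) → (2, 133), because each fold's metrics and hyperparameters are appended as a row to a results DataFrame. A minimal sketch of that accumulation pattern (assumed names and a reduced column set, not the notebook's exact code):

```python
# Sketch: accumulate one row of results per fold, so the DataFrame's
# shape grows from (1, n_cols) to (2, n_cols) and so on.
import pandas as pd

resultados = pd.DataFrame()
for fold_metrics in (
    {"Modelo": "CatBoost", "Tecnica": "Optuna con SMOTE", "Recall_Test": 85.71},
    {"Modelo": "CatBoost", "Tecnica": "Optuna con SMOTE", "Recall_Test": 88.10},
):
    fila = pd.DataFrame([fold_metrics])
    resultados = pd.concat([resultados, fila], ignore_index=True)
    print(f"✅ Tamaño del DataFrame actualizado: {resultados.shape}")
```

In the notebook the row holds 133 columns (all metrics plus every CatBoost hyperparameter, mostly `None`), which is why the second dimension is 133 rather than the 3 shown here.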

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5589183	total: 85.5ms	remaining: 24.4s
... (iterations 1-221 omitted; loss decreases steadily) ...
222:	learn: 0.0033959	total: 25.2s	remaining: 7.12s
223:	learn: 0.0033673	total: 25.4s	remaining: 7.02s
224:	learn: 0.0033303	total: 25.6s	remaining: 6.93s
225:	learn: 0.0033303	total: 25.7s	remaining: 6.82s
226:	learn: 0.0033052	total: 25.9s	remaining: 6.72s
227:	learn: 0.0033052	total: 26s	remaining: 6.61s
228:	learn: 0.0032997	total: 26.1s	remaining: 6.51s
229:	learn: 0.0032656	total: 26.3s	remaining: 6.41s
230:	learn: 0.0032656	total: 26.5s	remaining: 6.3s
231:	learn: 0.0032656	total: 26.6s	remaining: 6.19s
232:	learn: 0.0032656	total: 26.7s	remaining: 6.08s
233:	learn: 0.0032414	total: 26.9s	remaining: 5.98s
234:	learn: 0.0032252	total: 27.1s	remaining: 5.88s
235:	learn: 0.0032252	total: 27.2s	remaining: 5.77s
236:	learn: 0.0032252	total: 27.4s	remaining: 5.66s
237:	learn: 0.0032252	total: 27.5s	remaining: 5.55s
238:	learn: 0.0032252	total: 27.7s	remaining: 5.44s
239:	learn: 0.0032220	total: 27.8s	remaining: 5.33s
240:	learn: 0.0032171	total: 28s	remaining: 5.22s
241:	learn: 0.0032146	total: 28.1s	remaining: 5.11s
242:	learn: 0.0031945	total: 28.3s	remaining: 5s
243:	learn: 0.0031591	total: 28.5s	remaining: 4.9s
244:	learn: 0.0031591	total: 28.6s	remaining: 4.78s
245:	learn: 0.0031330	total: 28.8s	remaining: 4.68s
246:	learn: 0.0031330	total: 28.9s	remaining: 4.57s
247:	learn: 0.0031330	total: 29s	remaining: 4.45s
248:	learn: 0.0031330	total: 29.2s	remaining: 4.34s
249:	learn: 0.0031330	total: 29.3s	remaining: 4.22s
250:	learn: 0.0031330	total: 29.5s	remaining: 4.11s
251:	learn: 0.0031330	total: 29.6s	remaining: 3.99s
252:	learn: 0.0031330	total: 29.7s	remaining: 3.88s
253:	learn: 0.0031330	total: 29.9s	remaining: 3.76s
254:	learn: 0.0031330	total: 29.9s	remaining: 3.64s
255:	learn: 0.0031330	total: 30s	remaining: 3.52s
256:	learn: 0.0031330	total: 30.1s	remaining: 3.4s
257:	learn: 0.0031330	total: 30.2s	remaining: 3.27s
258:	learn: 0.0031330	total: 30.2s	remaining: 3.15s
259:	learn: 0.0031330	total: 30.3s	remaining: 3.03s
260:	learn: 0.0031241	total: 30.4s	remaining: 2.91s
261:	learn: 0.0031025	total: 30.5s	remaining: 2.79s
262:	learn: 0.0031025	total: 30.6s	remaining: 2.67s
263:	learn: 0.0031025	total: 30.6s	remaining: 2.55s
264:	learn: 0.0031025	total: 30.7s	remaining: 2.43s
265:	learn: 0.0031025	total: 30.8s	remaining: 2.31s
266:	learn: 0.0031025	total: 30.9s	remaining: 2.2s
267:	learn: 0.0031025	total: 31s	remaining: 2.08s
268:	learn: 0.0031025	total: 31.1s	remaining: 1.96s
269:	learn: 0.0030997	total: 31.1s	remaining: 1.84s
270:	learn: 0.0030848	total: 31.2s	remaining: 1.73s
271:	learn: 0.0030660	total: 31.3s	remaining: 1.61s
272:	learn: 0.0030660	total: 31.4s	remaining: 1.49s
273:	learn: 0.0030660	total: 31.4s	remaining: 1.38s
274:	learn: 0.0030521	total: 31.5s	remaining: 1.26s
275:	learn: 0.0030521	total: 31.6s	remaining: 1.15s
276:	learn: 0.0030428	total: 31.7s	remaining: 1.03s
277:	learn: 0.0030103	total: 31.8s	remaining: 916ms
278:	learn: 0.0030103	total: 31.9s	remaining: 801ms
279:	learn: 0.0030067	total: 32s	remaining: 685ms
280:	learn: 0.0029827	total: 32.1s	remaining: 571ms
281:	learn: 0.0029827	total: 32.2s	remaining: 457ms
282:	learn: 0.0029737	total: 32.4s	remaining: 344ms
283:	learn: 0.0029737	total: 32.6s	remaining: 229ms
284:	learn: 0.0029737	total: 32.7s	remaining: 115ms
285:	learn: 0.0029737	total: 32.8s	remaining: 0us
[I 2024-12-19 15:00:41,818] Trial 40 finished with value: 77.2542927985074 and parameters: {'learning_rate': 0.05779768842812921, 'max_depth': 6, 'n_estimators': 286, 'scale_pos_weight': 6.583256314651107}. Best is trial 37 with value: 80.67361029056296.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.71
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.86
 - F1-Score_Train: 99.86
 - Precision_Test: 28.61
 - Recall_Test: 86.51
 - AUPRC_Test: 75.97
 - Accuracy_Test: 99.61
 - F1-Score_Test: 43.00
 - max_depth: 6
 - n_estimators: 286
 - learning_rate: 0.06
 - scale_pos_weight: 6.58
 - [all other CatBoost parameters: None]
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 77.2543
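The cross-validated AUPRC reported above is the average-precision summary of the precision-recall curve. A minimal, self-contained sketch of how it can be computed with scikit-learn, using toy labels and scores rather than the project's data (the notebook reports it as a percentage, hence the `* 100`):

```python
from sklearn.metrics import average_precision_score, precision_recall_curve

# Toy ground truth and predicted scores for an imbalanced problem (1 = fraud).
y_true  = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
y_score = [0.10, 0.20, 0.05, 0.10, 0.30, 0.15, 0.50, 0.10, 0.90, 0.40]

# Average precision summarises the precision-recall curve in one number.
auprc = average_precision_score(y_true, y_score) * 100
precision, recall, thresholds = precision_recall_curve(y_true, y_score)
print(f"AUPRC: {auprc:.2f}%")  # → AUPRC: 83.33%
```

Unlike accuracy, this metric ignores the dominant negative class, which is why the notebook uses it as the selection criterion for fraud detection.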

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
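The before/after class-proportion printouts above can be reproduced with pandas `value_counts(normalize=True)`. A minimal sketch with a toy target (the variable `y` is hypothetical, not the project's code):

```python
import pandas as pd

# Toy imbalanced target mimicking the dataset's 'Class' column (1 = fraud).
y = pd.Series([0] * 997 + [1] * 3, name="Class")

# The logged proportions come from normalised value counts of the target.
print("📊 Antes de SMOTE:")
print(y.value_counts(normalize=True))
```

After resampling with SMOTE, the same call on the resampled target yields 0.5/0.5, as shown in the log.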

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Verbose per-iteration CatBoost training log truncated: 274 iterations, learn loss 0.5002241 → 0.0028766, total time 29.8s]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.71
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.86
 - Precision_Test: 26.47
 - Recall_Test: 85.71
 - AUPRC_Test: 79.46
 - Accuracy_Test: 99.58
 - F1-Score_Test: 40.45
 - max_depth: 6
 - n_estimators: 274
 - learning_rate: 0.09
 - scale_pos_weight: 6.12
 - [all other CatBoost parameters: None]
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[Verbose per-iteration CatBoost training log truncated at the end of this section: iterations 0–141 shown, learn loss 0.4954224 → 0.0045486]
142:	learn: 0.0045107	total: 16.7s	remaining: 15.3s
143:	learn: 0.0044783	total: 16.8s	remaining: 15.2s
144:	learn: 0.0044078	total: 16.9s	remaining: 15s
145:	learn: 0.0043515	total: 17s	remaining: 14.9s
146:	learn: 0.0042762	total: 17.1s	remaining: 14.8s
147:	learn: 0.0042168	total: 17.2s	remaining: 14.6s
148:	learn: 0.0041836	total: 17.3s	remaining: 14.5s
149:	learn: 0.0041710	total: 17.4s	remaining: 14.4s
150:	learn: 0.0041307	total: 17.5s	remaining: 14.2s
151:	learn: 0.0041160	total: 17.5s	remaining: 14.1s
152:	learn: 0.0040698	total: 17.6s	remaining: 14s
153:	learn: 0.0040407	total: 17.7s	remaining: 13.8s
154:	learn: 0.0040115	total: 17.8s	remaining: 13.7s
155:	learn: 0.0039986	total: 17.9s	remaining: 13.6s
156:	learn: 0.0039740	total: 18s	remaining: 13.4s
157:	learn: 0.0039481	total: 18.1s	remaining: 13.3s
158:	learn: 0.0038916	total: 18.2s	remaining: 13.2s
159:	learn: 0.0038401	total: 18.3s	remaining: 13s
160:	learn: 0.0038115	total: 18.4s	remaining: 12.9s
161:	learn: 0.0037793	total: 18.5s	remaining: 12.8s
162:	learn: 0.0037793	total: 18.6s	remaining: 12.6s
163:	learn: 0.0037527	total: 18.6s	remaining: 12.5s
164:	learn: 0.0036971	total: 18.8s	remaining: 12.4s
165:	learn: 0.0036971	total: 18.8s	remaining: 12.3s
166:	learn: 0.0036971	total: 18.9s	remaining: 12.1s
167:	learn: 0.0036934	total: 19s	remaining: 12s
168:	learn: 0.0036702	total: 19.1s	remaining: 11.9s
169:	learn: 0.0036308	total: 19.2s	remaining: 11.7s
170:	learn: 0.0036107	total: 19.3s	remaining: 11.6s
171:	learn: 0.0035900	total: 19.4s	remaining: 11.5s
172:	learn: 0.0035900	total: 19.5s	remaining: 11.4s
173:	learn: 0.0035900	total: 19.6s	remaining: 11.2s
174:	learn: 0.0035900	total: 19.6s	remaining: 11.1s
175:	learn: 0.0035899	total: 19.7s	remaining: 11s
176:	learn: 0.0035739	total: 19.8s	remaining: 10.8s
177:	learn: 0.0035225	total: 19.9s	remaining: 10.7s
178:	learn: 0.0034852	total: 20.1s	remaining: 10.6s
179:	learn: 0.0034852	total: 20.1s	remaining: 10.5s
180:	learn: 0.0034852	total: 20.2s	remaining: 10.4s
181:	learn: 0.0034665	total: 20.3s	remaining: 10.3s
182:	learn: 0.0034423	total: 20.4s	remaining: 10.2s
183:	learn: 0.0034130	total: 20.6s	remaining: 10.1s
184:	learn: 0.0033652	total: 20.7s	remaining: 9.98s
185:	learn: 0.0033527	total: 20.9s	remaining: 9.88s
186:	learn: 0.0033111	total: 21.1s	remaining: 9.8s
187:	learn: 0.0033111	total: 21.2s	remaining: 9.71s
188:	learn: 0.0033111	total: 21.4s	remaining: 9.6s
189:	learn: 0.0033110	total: 21.5s	remaining: 9.5s
190:	learn: 0.0033110	total: 21.6s	remaining: 9.41s
191:	learn: 0.0032936	total: 21.8s	remaining: 9.32s
192:	learn: 0.0032936	total: 22s	remaining: 9.22s
193:	learn: 0.0032936	total: 22.1s	remaining: 9.12s
194:	learn: 0.0032936	total: 22.2s	remaining: 9.01s
195:	learn: 0.0032936	total: 22.4s	remaining: 8.9s
196:	learn: 0.0032936	total: 22.5s	remaining: 8.8s
197:	learn: 0.0032936	total: 22.7s	remaining: 8.7s
198:	learn: 0.0032936	total: 22.8s	remaining: 8.59s
199:	learn: 0.0032936	total: 22.9s	remaining: 8.48s
200:	learn: 0.0032936	total: 23.1s	remaining: 8.38s
201:	learn: 0.0032936	total: 23.2s	remaining: 8.27s
202:	learn: 0.0032936	total: 23.3s	remaining: 8.15s
203:	learn: 0.0032935	total: 23.5s	remaining: 8.05s
204:	learn: 0.0032935	total: 23.6s	remaining: 7.94s
205:	learn: 0.0032935	total: 23.7s	remaining: 7.83s
206:	learn: 0.0032935	total: 23.9s	remaining: 7.73s
207:	learn: 0.0032935	total: 24s	remaining: 7.62s
208:	learn: 0.0032791	total: 24.2s	remaining: 7.51s
209:	learn: 0.0032791	total: 24.3s	remaining: 7.41s
210:	learn: 0.0032791	total: 24.5s	remaining: 7.3s
211:	learn: 0.0032791	total: 24.6s	remaining: 7.19s
212:	learn: 0.0032791	total: 24.7s	remaining: 7.08s
213:	learn: 0.0032791	total: 24.9s	remaining: 6.98s
214:	learn: 0.0032791	total: 25s	remaining: 6.87s
215:	learn: 0.0032791	total: 25.2s	remaining: 6.75s
216:	learn: 0.0032791	total: 25.3s	remaining: 6.64s
217:	learn: 0.0032791	total: 25.4s	remaining: 6.53s
218:	learn: 0.0032791	total: 25.6s	remaining: 6.42s
219:	learn: 0.0032627	total: 25.7s	remaining: 6.31s
220:	learn: 0.0032627	total: 25.8s	remaining: 6.2s
221:	learn: 0.0032627	total: 26s	remaining: 6.09s
222:	learn: 0.0032627	total: 26.1s	remaining: 5.97s
223:	learn: 0.0032627	total: 26.2s	remaining: 5.86s
224:	learn: 0.0032627	total: 26.4s	remaining: 5.74s
225:	learn: 0.0032627	total: 26.5s	remaining: 5.62s
226:	learn: 0.0032627	total: 26.6s	remaining: 5.5s
227:	learn: 0.0032627	total: 26.6s	remaining: 5.37s
228:	learn: 0.0032627	total: 26.7s	remaining: 5.25s
229:	learn: 0.0032627	total: 26.8s	remaining: 5.12s
230:	learn: 0.0032627	total: 26.8s	remaining: 5s
231:	learn: 0.0032627	total: 26.9s	remaining: 4.87s
232:	learn: 0.0032627	total: 27s	remaining: 4.75s
233:	learn: 0.0032627	total: 27.1s	remaining: 4.63s
234:	learn: 0.0032627	total: 27.1s	remaining: 4.5s
235:	learn: 0.0032627	total: 27.2s	remaining: 4.38s
236:	learn: 0.0032627	total: 27.3s	remaining: 4.26s
237:	learn: 0.0032627	total: 27.4s	remaining: 4.15s
238:	learn: 0.0032627	total: 27.5s	remaining: 4.02s
239:	learn: 0.0032627	total: 27.5s	remaining: 3.9s
240:	learn: 0.0032627	total: 27.6s	remaining: 3.78s
241:	learn: 0.0032627	total: 27.7s	remaining: 3.66s
242:	learn: 0.0032627	total: 27.8s	remaining: 3.54s
243:	learn: 0.0032627	total: 27.8s	remaining: 3.42s
244:	learn: 0.0032627	total: 27.9s	remaining: 3.3s
245:	learn: 0.0032627	total: 28s	remaining: 3.19s
246:	learn: 0.0032627	total: 28.1s	remaining: 3.07s
247:	learn: 0.0032627	total: 28.2s	remaining: 2.95s
248:	learn: 0.0032627	total: 28.2s	remaining: 2.83s
249:	learn: 0.0032627	total: 28.3s	remaining: 2.72s
250:	learn: 0.0032627	total: 28.4s	remaining: 2.6s
251:	learn: 0.0032627	total: 28.5s	remaining: 2.48s
252:	learn: 0.0032627	total: 28.6s	remaining: 2.37s
253:	learn: 0.0032626	total: 28.6s	remaining: 2.25s
254:	learn: 0.0032626	total: 28.7s	remaining: 2.14s
255:	learn: 0.0032626	total: 28.8s	remaining: 2.02s
256:	learn: 0.0032626	total: 28.9s	remaining: 1.91s
257:	learn: 0.0032626	total: 28.9s	remaining: 1.79s
258:	learn: 0.0032626	total: 29s	remaining: 1.68s
259:	learn: 0.0032626	total: 29.1s	remaining: 1.56s
260:	learn: 0.0032626	total: 29.1s	remaining: 1.45s
261:	learn: 0.0032626	total: 29.2s	remaining: 1.34s
262:	learn: 0.0032626	total: 29.3s	remaining: 1.23s
263:	learn: 0.0032626	total: 29.4s	remaining: 1.11s
264:	learn: 0.0032626	total: 29.5s	remaining: 1s
265:	learn: 0.0032626	total: 29.6s	remaining: 889ms
266:	learn: 0.0032626	total: 29.6s	remaining: 777ms
267:	learn: 0.0032626	total: 29.7s	remaining: 665ms
268:	learn: 0.0032626	total: 29.8s	remaining: 554ms
269:	learn: 0.0032626	total: 29.9s	remaining: 442ms
270:	learn: 0.0032625	total: 29.9s	remaining: 331ms
271:	learn: 0.0032626	total: 30s	remaining: 221ms
272:	learn: 0.0032626	total: 30.1s	remaining: 110ms
273:	learn: 0.0032625	total: 30.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.70
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.85
 - Precision_Test: 30.43
 - Recall_Test: 88.89
 - AUPRC_Test: 77.54
 - Accuracy_Test: 99.64
 - F1-Score_Test: 45.34
 - max_depth: 6
 - n_estimators: 274
 - learning_rate: 0.09
 - scale_pos_weight: 6.12
 [... all other CatBoost parameters omitted: every remaining entry is None (library defaults) ...]
✅ Tamaño del DataFrame actualizado: (2, 133)
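The block above reports AUPRC (area under the precision-recall curve) on train and test, the metric this project optimizes because of the extreme class imbalance. A minimal sketch of how that metric is computed with scikit-learn, on hypothetical toy scores rather than the project's data:

```python
import numpy as np
from sklearn.metrics import average_precision_score

# Hypothetical 95/5 imbalanced toy labels and scores (not the project's data):
# legitimate transactions get low scores, fraudulent ones high scores.
rng = np.random.default_rng(0)
y_true = np.array([0] * 95 + [1] * 5)
y_score = np.concatenate([rng.uniform(0.0, 0.4, 95),   # class 0: low scores
                          rng.uniform(0.3, 1.0, 5)])   # class 1: high scores

# average_precision_score is scikit-learn's AUPRC estimate
auprc = average_precision_score(y_true, y_score)
print(f"AUPRC: {auprc:.2%}")
```

Unlike accuracy (which is ~99.6% here simply because fraud is rare), AUPRC summarizes precision across all recall levels, so it drops sharply when a model ranks frauds poorly.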

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5033207	total: 137ms	remaining: 37.3s
[... iterations 1-272 omitted: the training loss decreases steadily and plateaus at ~0.00314 from iteration ~201 onward ...]
273:	learn: 0.0031369	total: 32.1s	remaining: 0us
[I 2024-12-19 15:02:20,999] Trial 41 finished with value: 78.3979405034687 and parameters: {'learning_rate': 0.08686086644631785, 'max_depth': 6, 'n_estimators': 274, 'scale_pos_weight': 6.11935856633133}. Best is trial 37 with value: 80.67361029056296.
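Trial 41 above sampled `learning_rate`, `max_depth`, `n_estimators`, and `scale_pos_weight`, trained on the SMOTE-balanced folds, and scored 78.40 AUPRC, while trial 37 remains best at 80.67. Conceptually, Optuna repeats this sample-train-score loop while steering sampling toward promising regions; a plain random-search analogue of that loop over the same search space, with a toy objective standing in for the real train-and-score step (all names hypothetical):

```python
import random

# Toy stand-in for "train CatBoost on SMOTE folds, return mean AUPRC".
# The real objective fits a model; this one just rewards closeness to a
# hypothetical optimum so the loop is self-contained and runnable.
def toy_objective(params):
    return (100
            - abs(params["learning_rate"] - 0.05) * 100
            - abs(params["max_depth"] - 6)
            - abs(params["n_estimators"] - 300) / 50)

random.seed(0)
best_value, best_params = float("-inf"), None
for trial in range(50):
    # Sample a candidate from the same search space seen in the trial log
    params = {
        "learning_rate": random.uniform(0.01, 0.3),
        "max_depth": random.randint(3, 10),
        "n_estimators": random.randint(100, 500),
        "scale_pos_weight": random.uniform(1.0, 10.0),
    }
    value = toy_objective(params)
    if value > best_value:
        best_value, best_params = value, params

print(f"Best value: {best_value:.2f} with {best_params}")
```

Optuna's TPE sampler improves on this by modeling which regions of the space produced good trials, which is why later trials in the log cluster around similar hyperparameters.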
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.68
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.84
 - F1-Score_Train: 99.84
 - Precision_Test: 28.08
 - Recall_Test: 84.92
 - AUPRC_Test: 78.19
 - Accuracy_Test: 99.61
 - F1-Score_Test: 42.21
 - max_depth: 6
 - n_estimators: 274
 - learning_rate: 0.09
 - scale_pos_weight: 6.12
 [... all other CatBoost parameters omitted: every remaining entry is None (library defaults) ...]
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 78.3979
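The fold logs above show SMOTE rebalancing each training fold from ~0.17% fraud to an exact 50/50 split before the model is fit. For reference, a minimal sketch of the interpolation SMOTE performs, using only NumPy and scikit-learn; `smote_sketch` is a hypothetical helper, not the project's code, which uses `imblearn.over_sampling.SMOTE`:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def smote_sketch(X_min, n_new, k=5, seed=0):
    """SMOTE-style oversampling: each synthetic sample interpolates
    between a minority point and one of its k nearest minority neighbors."""
    rng = np.random.default_rng(seed)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    _, idx = nn.kneighbors(X_min)               # idx[:, 0] is the point itself
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))            # pick a minority sample
        j = idx[i][rng.integers(1, k + 1)]      # pick one of its k neighbors
        lam = rng.random()                      # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.vstack(synthetic)

# Toy minority class: 10 points in 4 dimensions, grown by 90 synthetic samples
X_min = np.random.default_rng(1).normal(size=(10, 4))
X_new = smote_sketch(X_min, n_new=90)
print(X_new.shape)  # (90, 4)
```

Because the synthetic points are generated only from the training fold (as the "Antes de SMOTE / Después de SMOTE" printouts confirm), the test fold keeps its original ~0.17% fraud rate and the reported test metrics remain honest.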

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5001779	total: 87ms	remaining: 25.1s
1:	learn: 0.3353381	total: 183ms	remaining: 26.2s
2:	learn: 0.2436284	total: 273ms	remaining: 26s
[... iterations 3-286 omitted: training loss decreases steadily and plateaus near 0.00273 from roughly iteration 213 onward ...]
287:	learn: 0.0027318	total: 31.6s	remaining: 110ms
288:	learn: 0.0027318	total: 31.7s	remaining: 0us

✅ Results for CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.72
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.86
 - F1-Score_Train: 99.86
 - Precision_Test: 28.23
 - Recall_Test: 84.92
 - AUPRC_Test: 80.09
 - Accuracy_Test: 99.61
 - F1-Score_Test: 42.38
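The `Sobreajuste` (overfitting) flag of 1 is consistent with the large train/test gap above (AUPRC 99.99 on train vs. 80.09 on test). A hypothetical helper showing how such a flag could be derived from that gap; the 10-point threshold is an assumption for illustration, not taken from the notebook:

```python
def overfitting_flag(train_score, test_score, max_gap=10.0):
    """Return 1 if the train score exceeds the test score by more than
    max_gap percentage points, else 0. Threshold is illustrative."""
    return int(train_score - test_score > max_gap)

print(overfitting_flag(99.99, 80.09))  # -> 1  (gap of ~19.9 points)
print(overfitting_flag(82.00, 80.09))  # -> 0  (gap of ~1.9 points)
```

A gap this size is typical when training on SMOTE-balanced data and evaluating on the original imbalanced distribution, which is why the test-set AUPRC, not the train metrics, drives model selection here.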
 - max_depth: 6
 - n_estimators: 289
 - learning_rate: 0.09
 - scale_pos_weight: 6.10
 (all remaining CatBoost parameters were printed as None, i.e. left at library defaults)
✅ Updated DataFrame size: (1, 133)

🔄 Fold 2: Optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna con SMOTE)...
0:	learn: 0.4953542	total: 96.5ms	remaining: 27.8s
1:	learn: 0.3580830	total: 192ms	remaining: 27.5s
2:	learn: 0.2677059	total: 281ms	remaining: 26.8s
[... iterations 3-286 omitted: training loss decreases steadily and plateaus around 0.0030-0.0029 from roughly iteration 218 onward ...]
287:	learn: 0.0029038	total: 31.7s	remaining: 110ms
288:	learn: 0.0029038	total: 31.7s	remaining: 0us

✅ Results for CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.73
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.86
 - F1-Score_Train: 99.86
 - Precision_Test: 31.90
 - Recall_Test: 88.10
 - AUPRC_Test: 80.26
 - Accuracy_Test: 99.66
 - F1-Score_Test: 46.84
 - max_depth: 6
 - n_estimators: 289
 - learning_rate: 0.09
 - scale_pos_weight: 6.10
 [... remaining CatBoost parameters omitted: all reported as None (library defaults) ...]
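The "Sobreajuste: 1" flag above marks a large train/test gap (AUPRC 99.99 in training vs 80.26 in test). A minimal sketch of such a flag, assuming a simple threshold on the AUPRC gap — the notebook's exact rule is not shown in this output, so both the function name and the 10-point threshold are illustrative assumptions:

```python
def overfit_flag(auprc_train: float, auprc_test: float, gap_threshold: float = 10.0) -> int:
    """Return 1 when the train/test AUPRC gap exceeds the threshold, else 0.
    The 10-point threshold is a hypothetical choice, not the notebook's rule."""
    return int(auprc_train - auprc_test > gap_threshold)

# Values reported for this trial: AUPRC_Train = 99.99, AUPRC_Test = 80.26
print(overfit_flag(99.99, 80.26))  # gap of ~19.7 points -> 1
```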
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
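The before/after proportions printed above come from SMOTE rebalancing the training fold to 50/50. The notebook presumably uses `imblearn.over_sampling.SMOTE`; the pure-Python sketch below shows only SMOTE's core idea — synthesizing minority samples by interpolating between existing minority pairs until the classes are balanced (the real algorithm interpolates toward k-nearest neighbours, omitted here):

```python
import random
from collections import Counter

def proportions(labels):
    """Normalized class frequencies, like pandas value_counts(normalize=True)."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {cls: n / total for cls, n in counts.items()}

def smote_like_oversample(X, y, minority=1, seed=42):
    """Append synthetic minority points interpolated between random minority
    pairs until both classes have the same count (k-NN selection omitted)."""
    rng = random.Random(seed)
    minority_pts = [x for x, label in zip(X, y) if label == minority]
    majority_n = sum(1 for label in y if label != minority)
    X_new, y_new = list(X), list(y)
    while sum(1 for label in y_new if label == minority) < majority_n:
        a, b = rng.sample(minority_pts, 2)
        t = rng.random()  # interpolation factor in [0, 1)
        X_new.append([ai + t * (bi - ai) for ai, bi in zip(a, b)])
        y_new.append(minority)
    return X_new, y_new

# Toy imbalanced data mimicking the ~0.17% fraud rate shown in the log
X = [[0.0, 0.0]] * 997 + [[1.0, 1.0], [1.2, 0.9], [0.9, 1.1]]
y = [0] * 997 + [1] * 3
X_bal, y_bal = smote_like_oversample(X, y)
print(proportions(y))      # heavy imbalance before
print(proportions(y_bal))  # {0: 0.5, 1: 0.5} after
```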

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5032748	total: 151ms	remaining: 43.4s
1:	learn: 0.3450861	total: 321ms	remaining: 46s
2:	learn: 0.2444150	total: 503ms	remaining: 48s
3:	learn: 0.1971804	total: 661ms	remaining: 47.1s
4:	learn: 0.1612193	total: 839ms	remaining: 47.7s
5:	learn: 0.1291887	total: 1.02s	remaining: 47.9s
6:	learn: 0.1141675	total: 1.19s	remaining: 47.9s
[... iterations 7–287 omitted: training loss decreases steadily and plateaus near 0.0030 ...]
288:	learn: 0.0030041	total: 31.5s	remaining: 0us
[I 2024-12-19 15:04:03,668] Trial 42 finished with value: 80.1027112131879 and parameters: {'learning_rate': 0.0869330948309761, 'max_depth': 6, 'n_estimators': 289, 'scale_pos_weight': 6.103771581805754}. Best is trial 37 with value: 80.67361029056296.
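The `[I ...] Trial 42 finished ...` line is Optuna reporting one sampled hyperparameter set and its cross-validated AUPRC. A stdlib random-search sketch of that loop over the same four ranges — Optuna's TPE sampler replaces the uniform draws, and `evaluate` here is a toy stand-in for the notebook's SMOTE + CatBoost cross-validation objective:

```python
import random

def evaluate(params):
    """Toy objective peaking near the logged best region (illustrative only;
    the real objective trains CatBoost and returns the CV AUPRC)."""
    return 80.0 - abs(params["learning_rate"] - 0.09) * 50 - abs(params["max_depth"] - 6)

def random_search(n_trials=50, seed=0):
    rng = random.Random(seed)
    best_score, best_params = float("-inf"), None
    for _ in range(n_trials):
        # Same search space as the logged trial parameters (ranges assumed)
        params = {
            "learning_rate": rng.uniform(0.01, 0.3),
            "max_depth": rng.randint(3, 10),
            "n_estimators": rng.randint(100, 500),
            "scale_pos_weight": rng.uniform(1.0, 10.0),
        }
        score = evaluate(params)
        if score > best_score:
            best_score, best_params = score, params
    return best_score, best_params

score, params = random_search()
print(round(score, 2), params)
```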
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.71
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.86
 - Precision_Test: 29.92
 - Recall_Test: 85.71
 - AUPRC_Test: 79.95
 - Accuracy_Test: 99.64
 - F1-Score_Test: 44.35
 - max_depth: 6
 - n_estimators: 289
 - learning_rate: 0.09
 - scale_pos_weight: 6.10
 [... remaining CatBoost parameters omitted: all reported as None (library defaults) ...]
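The reported F1-Score_Test is the harmonic mean of the test precision and recall. A quick consistency check against the two result blocks above (small deviations are expected because the printed inputs are already rounded to two decimals):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (inputs and output in percent)."""
    return 2 * precision * recall / (precision + recall)

print(round(f1_score(31.90, 88.10), 2))  # 46.84, matching the earlier fold
print(round(f1_score(29.92, 85.71), 2))  # close to the reported 44.35
```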
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 80.1027
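The cross-validation figure above is the mean of the per-fold test AUPRC values. A minimal sketch of both pieces, assuming a stdlib re-implementation of average precision (the notebook presumably uses `sklearn.metrics.average_precision_score`); the fold values in the second half are hypothetical, chosen only to illustrate the averaging:

```python
def average_precision(y_true, scores):
    """AUPRC as the sum of precision-at-rank times the recall increment,
    over examples ranked by descending score."""
    ranked = sorted(zip(scores, y_true), key=lambda p: p[0], reverse=True)
    positives = sum(y_true)
    tp, ap = 0, 0.0
    for rank, (_, label) in enumerate(ranked, start=1):
        if label == 1:
            tp += 1
            ap += (tp / rank) * (1 / positives)  # precision x recall step
    return ap

# Perfect ranking: all positives scored above all negatives -> AUPRC = 1.0
print(average_precision([1, 1, 0, 0], [0.9, 0.8, 0.2, 0.1]))

# Per-fold test AUPRC values (hypothetical) are then averaged, as in the log
folds = [80.26, 80.42, 79.95]
print(round(sum(folds) / len(folds), 2))  # 80.21
```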

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5053669	total: 164ms	remaining: 47.1s
1:	learn: 0.3444938	total: 334ms	remaining: 47.7s
2:	learn: 0.2510589	total: 529ms	remaining: 50.2s
3:	learn: 0.1848036	total: 707ms	remaining: 50.2s
4:	learn: 0.1483785	total: 903ms	remaining: 51.1s
5:	learn: 0.1190417	total: 1.08s	remaining: 50.7s
[... iterations 6–234 omitted: training loss decreases steadily toward ~0.0032 ...]
235:	learn: 0.0031523	total: 24.3s	remaining: 5.36s
236:	learn: 0.0031523	total: 24.4s	remaining: 5.25s
237:	learn: 0.0031523	total: 24.5s	remaining: 5.14s
238:	learn: 0.0031523	total: 24.6s	remaining: 5.04s
239:	learn: 0.0031523	total: 24.7s	remaining: 4.93s
240:	learn: 0.0031523	total: 24.7s	remaining: 4.82s
241:	learn: 0.0031523	total: 24.8s	remaining: 4.72s
242:	learn: 0.0031523	total: 24.9s	remaining: 4.61s
243:	learn: 0.0031523	total: 25s	remaining: 4.5s
244:	learn: 0.0031523	total: 25s	remaining: 4.39s
245:	learn: 0.0031523	total: 25.1s	remaining: 4.29s
246:	learn: 0.0031523	total: 25.2s	remaining: 4.18s
247:	learn: 0.0031523	total: 25.3s	remaining: 4.07s
248:	learn: 0.0031523	total: 25.3s	remaining: 3.97s
249:	learn: 0.0031523	total: 25.4s	remaining: 3.86s
250:	learn: 0.0031523	total: 25.5s	remaining: 3.75s
251:	learn: 0.0031523	total: 25.6s	remaining: 3.65s
252:	learn: 0.0031523	total: 25.6s	remaining: 3.54s
253:	learn: 0.0031523	total: 25.7s	remaining: 3.44s
254:	learn: 0.0031523	total: 25.8s	remaining: 3.33s
255:	learn: 0.0031523	total: 25.8s	remaining: 3.23s
256:	learn: 0.0031523	total: 25.9s	remaining: 3.13s
257:	learn: 0.0031523	total: 26s	remaining: 3.02s
258:	learn: 0.0031523	total: 26.1s	remaining: 2.92s
259:	learn: 0.0031523	total: 26.1s	remaining: 2.81s
260:	learn: 0.0031523	total: 26.2s	remaining: 2.71s
261:	learn: 0.0031523	total: 26.3s	remaining: 2.61s
262:	learn: 0.0031523	total: 26.3s	remaining: 2.5s
263:	learn: 0.0031523	total: 26.4s	remaining: 2.4s
264:	learn: 0.0031523	total: 26.5s	remaining: 2.3s
265:	learn: 0.0031523	total: 26.6s	remaining: 2.2s
266:	learn: 0.0031523	total: 26.6s	remaining: 2.09s
267:	learn: 0.0031523	total: 26.7s	remaining: 1.99s
268:	learn: 0.0031523	total: 26.8s	remaining: 1.89s
269:	learn: 0.0031523	total: 26.8s	remaining: 1.79s
270:	learn: 0.0031523	total: 26.9s	remaining: 1.69s
271:	learn: 0.0031523	total: 27s	remaining: 1.59s
272:	learn: 0.0031523	total: 27.1s	remaining: 1.49s
273:	learn: 0.0031523	total: 27.1s	remaining: 1.39s
274:	learn: 0.0031523	total: 27.2s	remaining: 1.28s
275:	learn: 0.0031523	total: 27.3s	remaining: 1.19s
276:	learn: 0.0031523	total: 27.3s	remaining: 1.08s
277:	learn: 0.0031523	total: 27.4s	remaining: 986ms
278:	learn: 0.0031523	total: 27.5s	remaining: 888ms
279:	learn: 0.0031523	total: 27.6s	remaining: 790ms
280:	learn: 0.0031523	total: 27.8s	remaining: 692ms
281:	learn: 0.0031523	total: 27.9s	remaining: 593ms
282:	learn: 0.0031523	total: 28s	remaining: 495ms
283:	learn: 0.0031523	total: 28.1s	remaining: 396ms
284:	learn: 0.0031523	total: 28.3s	remaining: 298ms
285:	learn: 0.0031523	total: 28.4s	remaining: 199ms
286:	learn: 0.0031523	total: 28.5s	remaining: 99.4ms
287:	learn: 0.0031523	total: 28.7s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.71
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.85
 - Precision_Test: 26.29
 - Recall_Test: 84.92
 - AUPRC_Test: 80.18
 - Accuracy_Test: 99.57
 - F1-Score_Test: 40.15
 - max_depth: 6
 - n_estimators: 288
 - learning_rate: 0.09
 - scale_pos_weight: 5.47
 - (all remaining CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (1, 133)
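The `Sobreajuste: 1` line above flags overfitting, consistent with the large train/test gap (e.g. Recall 100.00 on train vs 84.92 on test). One plausible rule for such a flag is a threshold on the train-test gap in percentage points; the function name and the 10-point threshold below are illustrative assumptions, not the notebook's actual criterion:

```python
def overfit_flag(train_metric, test_metric, gap=10.0):
    """Flag overfitting (1) when the train score exceeds the test score
    by more than `gap` percentage points; illustrative threshold only."""
    return int(train_metric - test_metric > gap)

# Fold 1 above: Recall_Train 100.00 vs Recall_Test 84.92
print(overfit_flag(100.00, 84.92))  # -> 1 (flagged)
```

With a 15-point recall gap, any reasonable threshold in this range would mark the fold as overfit, which matches the printed flag.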

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
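The 📊/📈 proportion printouts show each training fold being resampled from ~0.17% fraud to a 50/50 balance before fitting. The notebook uses SMOTE for this; as a self-contained sketch of the idea (synthesizing minority points by interpolating between minority neighbours), here is a toy NumPy version — function name, toy data, and `k` are illustrative, not the notebook's code:

```python
import numpy as np

def smote_like_oversample(X, y, minority=1, k=2, seed=0):
    """Toy SMOTE-style oversampling: create synthetic minority samples by
    interpolating between a minority point and one of its k nearest
    minority neighbours, until both classes have equal counts."""
    rng = np.random.default_rng(seed)
    X_min = X[y == minority]
    n_needed = (y != minority).sum() - len(X_min)
    synth = []
    for _ in range(n_needed):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)  # distances to other minority points
        nn = np.argsort(d)[1:k + 1]                   # skip the point itself
        j = rng.choice(nn)
        lam = rng.random()                            # interpolation factor in [0, 1)
        synth.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    X_new = np.vstack([X, synth])
    y_new = np.concatenate([y, np.full(n_needed, minority)])
    return X_new, y_new

# Tiny imbalanced toy set: 8 negatives, 3 positives
X = np.vstack([np.random.default_rng(1).normal(0, 1, (8, 2)),
               np.random.default_rng(2).normal(3, 1, (3, 2))])
y = np.array([0] * 8 + [1] * 3)
Xb, yb = smote_like_oversample(X, y)
print((yb == 0).sum(), (yb == 1).sum())  # -> 8 8 (balanced)
```

The key point, reflected in the per-fold printouts above, is that resampling is applied only to the training split of each fold, so the test split keeps the original ~0.17% fraud rate.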

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5049969	total: 188ms	remaining: 54.1s
1:	learn: 0.3745148	total: 377ms	remaining: 53.8s
...	(iterations 2-286 omitted: the loss declines steadily, then plateaus at learn: ≈0.0033 from iteration ~259)
287:	learn: 0.0033064	total: 31s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.71
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.86
 - Precision_Test: 30.00
 - Recall_Test: 88.10
 - AUPRC_Test: 78.63
 - Accuracy_Test: 99.63
 - F1-Score_Test: 44.76
 - max_depth: 6
 - n_estimators: 288
 - learning_rate: 0.09
 - scale_pos_weight: 5.47
 - (all remaining CatBoost parameters: None, i.e. library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)
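AUPRC (area under the precision-recall curve) is the headline test metric in the result blocks above, since accuracy is uninformative at a ~0.17% fraud rate. It can be computed as average precision: the mean of the precision values at the rank of each true positive. A self-contained sketch (the scores below are made-up illustration data, not notebook output):

```python
import numpy as np

def average_precision(y_true, scores):
    """Average precision: for each positive, take the precision at that
    positive's rank in the score-sorted list, then average over positives.
    This equals the step-wise area under the precision-recall curve."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    y = np.asarray(y_true)[order]
    tp = np.cumsum(y)                             # true positives at each cutoff
    precision = tp / np.arange(1, len(y) + 1)     # precision at each cutoff
    return precision[y == 1].sum() / y.sum()

y_true = [0, 0, 1, 1, 0, 1]
scores = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7]
print(round(average_precision(y_true, scores), 4))  # -> 0.9167
```

This matches `sklearn.metrics.average_precision_score` on the same inputs; multiplying by 100 gives the percentage form (e.g. `AUPRC_Test: 80.18`) reported above.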

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5087438	total: 85.3ms	remaining: 24.5s
1:	learn: 0.3643266	total: 176ms	remaining: 25.2s
...	(iterations 2-143 omitted: the loss declines steadily)
144:	learn: 0.0041343	total: 16.5s	remaining: 16.2s
145:	learn: 0.0041053	total: 16.6s	remaining: 16.1s
146:	learn: 0.0040639	total: 16.6s	remaining: 16s
147:	learn: 0.0040244	total: 16.8s	remaining: 15.8s
148:	learn: 0.0039862	total: 16.8s	remaining: 15.7s
149:	learn: 0.0039557	total: 16.9s	remaining: 15.6s
150:	learn: 0.0039128	total: 17s	remaining: 15.4s
151:	learn: 0.0038973	total: 17.1s	remaining: 15.3s
152:	learn: 0.0038571	total: 17.2s	remaining: 15.2s
153:	learn: 0.0037915	total: 17.3s	remaining: 15.1s
154:	learn: 0.0037790	total: 17.4s	remaining: 15s
155:	learn: 0.0037520	total: 17.5s	remaining: 14.8s
156:	learn: 0.0036917	total: 17.6s	remaining: 14.7s
157:	learn: 0.0036648	total: 17.7s	remaining: 14.6s
158:	learn: 0.0036648	total: 17.8s	remaining: 14.4s
159:	learn: 0.0036221	total: 17.9s	remaining: 14.3s
160:	learn: 0.0035922	total: 18s	remaining: 14.2s
161:	learn: 0.0035773	total: 18.1s	remaining: 14s
162:	learn: 0.0035115	total: 18.2s	remaining: 13.9s
163:	learn: 0.0034923	total: 18.3s	remaining: 13.8s
164:	learn: 0.0034391	total: 18.4s	remaining: 13.7s
165:	learn: 0.0034390	total: 18.5s	remaining: 13.6s
166:	learn: 0.0034390	total: 18.5s	remaining: 13.4s
167:	learn: 0.0034037	total: 18.6s	remaining: 13.3s
168:	learn: 0.0033780	total: 18.7s	remaining: 13.2s
169:	learn: 0.0033515	total: 18.8s	remaining: 13.1s
170:	learn: 0.0033414	total: 18.9s	remaining: 12.9s
171:	learn: 0.0033414	total: 19s	remaining: 12.8s
172:	learn: 0.0033118	total: 19.1s	remaining: 12.7s
173:	learn: 0.0032741	total: 19.2s	remaining: 12.6s
174:	learn: 0.0032741	total: 19.3s	remaining: 12.4s
175:	learn: 0.0032740	total: 19.3s	remaining: 12.3s
176:	learn: 0.0032740	total: 19.4s	remaining: 12.2s
177:	learn: 0.0032740	total: 19.5s	remaining: 12s
178:	learn: 0.0032435	total: 19.6s	remaining: 11.9s
179:	learn: 0.0032435	total: 19.7s	remaining: 11.8s
180:	learn: 0.0032143	total: 19.8s	remaining: 11.7s
181:	learn: 0.0031974	total: 19.9s	remaining: 11.6s
182:	learn: 0.0031772	total: 19.9s	remaining: 11.4s
183:	learn: 0.0031772	total: 20s	remaining: 11.3s
184:	learn: 0.0031772	total: 20.1s	remaining: 11.2s
185:	learn: 0.0031772	total: 20.2s	remaining: 11.1s
186:	learn: 0.0031772	total: 20.2s	remaining: 10.9s
187:	learn: 0.0031772	total: 20.4s	remaining: 10.8s
188:	learn: 0.0031772	total: 20.4s	remaining: 10.7s
189:	learn: 0.0031771	total: 20.5s	remaining: 10.6s
190:	learn: 0.0031771	total: 20.6s	remaining: 10.5s
191:	learn: 0.0031771	total: 20.7s	remaining: 10.3s
192:	learn: 0.0031771	total: 20.7s	remaining: 10.2s
193:	learn: 0.0031771	total: 20.8s	remaining: 10.1s
194:	learn: 0.0031771	total: 20.9s	remaining: 9.96s
195:	learn: 0.0031771	total: 21s	remaining: 9.83s
196:	learn: 0.0031771	total: 21s	remaining: 9.71s
197:	learn: 0.0031771	total: 21.1s	remaining: 9.6s
198:	learn: 0.0031771	total: 21.2s	remaining: 9.47s
199:	learn: 0.0031771	total: 21.2s	remaining: 9.35s
200:	learn: 0.0031771	total: 21.3s	remaining: 9.22s
201:	learn: 0.0031770	total: 21.4s	remaining: 9.11s
202:	learn: 0.0031770	total: 21.5s	remaining: 8.98s
203:	learn: 0.0031770	total: 21.5s	remaining: 8.87s
204:	learn: 0.0031770	total: 21.6s	remaining: 8.75s
205:	learn: 0.0031770	total: 21.7s	remaining: 8.63s
206:	learn: 0.0031770	total: 21.8s	remaining: 8.51s
207:	learn: 0.0031769	total: 21.8s	remaining: 8.4s
208:	learn: 0.0031769	total: 21.9s	remaining: 8.28s
209:	learn: 0.0031769	total: 22s	remaining: 8.16s
210:	learn: 0.0031769	total: 22s	remaining: 8.04s
211:	learn: 0.0031769	total: 22.1s	remaining: 7.93s
212:	learn: 0.0031769	total: 22.2s	remaining: 7.82s
213:	learn: 0.0031769	total: 22.3s	remaining: 7.7s
214:	learn: 0.0031769	total: 22.3s	remaining: 7.58s
215:	learn: 0.0031769	total: 22.4s	remaining: 7.47s
216:	learn: 0.0031769	total: 22.5s	remaining: 7.35s
217:	learn: 0.0031769	total: 22.5s	remaining: 7.23s
218:	learn: 0.0031768	total: 22.6s	remaining: 7.12s
219:	learn: 0.0031768	total: 22.7s	remaining: 7.01s
220:	learn: 0.0031768	total: 22.8s	remaining: 6.9s
221:	learn: 0.0031768	total: 22.8s	remaining: 6.78s
222:	learn: 0.0031768	total: 22.9s	remaining: 6.67s
223:	learn: 0.0031768	total: 23s	remaining: 6.56s
224:	learn: 0.0031768	total: 23s	remaining: 6.45s
225:	learn: 0.0031768	total: 23.1s	remaining: 6.33s
226:	learn: 0.0031768	total: 23.1s	remaining: 6.22s
227:	learn: 0.0031768	total: 23.2s	remaining: 6.11s
228:	learn: 0.0031768	total: 23.3s	remaining: 6s
229:	learn: 0.0031768	total: 23.4s	remaining: 5.89s
230:	learn: 0.0031767	total: 23.4s	remaining: 5.78s
231:	learn: 0.0031767	total: 23.5s	remaining: 5.67s
232:	learn: 0.0031767	total: 23.6s	remaining: 5.56s
233:	learn: 0.0031767	total: 23.6s	remaining: 5.45s
234:	learn: 0.0031767	total: 23.7s	remaining: 5.35s
235:	learn: 0.0031767	total: 23.8s	remaining: 5.24s
236:	learn: 0.0031766	total: 23.9s	remaining: 5.14s
237:	learn: 0.0031766	total: 23.9s	remaining: 5.03s
238:	learn: 0.0031766	total: 24s	remaining: 4.92s
239:	learn: 0.0031766	total: 24.1s	remaining: 4.81s
240:	learn: 0.0031766	total: 24.2s	remaining: 4.71s
241:	learn: 0.0031766	total: 24.3s	remaining: 4.61s
242:	learn: 0.0031766	total: 24.4s	remaining: 4.51s
243:	learn: 0.0031766	total: 24.5s	remaining: 4.42s
244:	learn: 0.0031766	total: 24.6s	remaining: 4.32s
245:	learn: 0.0031766	total: 24.7s	remaining: 4.22s
246:	learn: 0.0031766	total: 24.8s	remaining: 4.12s
247:	learn: 0.0031765	total: 25s	remaining: 4.03s
248:	learn: 0.0031765	total: 25.1s	remaining: 3.93s
249:	learn: 0.0031765	total: 25.2s	remaining: 3.83s
250:	learn: 0.0031765	total: 25.3s	remaining: 3.73s
251:	learn: 0.0031765	total: 25.4s	remaining: 3.63s
252:	learn: 0.0031765	total: 25.6s	remaining: 3.54s
253:	learn: 0.0031765	total: 25.7s	remaining: 3.44s
254:	learn: 0.0031765	total: 25.8s	remaining: 3.34s
255:	learn: 0.0031765	total: 25.9s	remaining: 3.24s
256:	learn: 0.0031765	total: 26.1s	remaining: 3.14s
257:	learn: 0.0031765	total: 26.2s	remaining: 3.04s
258:	learn: 0.0031765	total: 26.3s	remaining: 2.94s
259:	learn: 0.0031765	total: 26.4s	remaining: 2.85s
260:	learn: 0.0031764	total: 26.6s	remaining: 2.75s
261:	learn: 0.0031764	total: 26.7s	remaining: 2.65s
262:	learn: 0.0031764	total: 26.8s	remaining: 2.55s
263:	learn: 0.0031555	total: 27s	remaining: 2.45s
264:	learn: 0.0031555	total: 27.1s	remaining: 2.35s
265:	learn: 0.0031555	total: 27.2s	remaining: 2.25s
266:	learn: 0.0031555	total: 27.4s	remaining: 2.15s
267:	learn: 0.0031555	total: 27.5s	remaining: 2.05s
268:	learn: 0.0031555	total: 27.6s	remaining: 1.95s
269:	learn: 0.0031555	total: 27.7s	remaining: 1.85s
270:	learn: 0.0031555	total: 27.9s	remaining: 1.75s
271:	learn: 0.0031555	total: 28s	remaining: 1.65s
272:	learn: 0.0031555	total: 28.2s	remaining: 1.55s
273:	learn: 0.0031555	total: 28.3s	remaining: 1.44s
274:	learn: 0.0031555	total: 28.4s	remaining: 1.34s
275:	learn: 0.0031555	total: 28.5s	remaining: 1.24s
276:	learn: 0.0031555	total: 28.7s	remaining: 1.14s
277:	learn: 0.0031555	total: 28.8s	remaining: 1.04s
278:	learn: 0.0031555	total: 28.9s	remaining: 933ms
279:	learn: 0.0031555	total: 29.1s	remaining: 830ms
280:	learn: 0.0031555	total: 29.2s	remaining: 727ms
281:	learn: 0.0031555	total: 29.3s	remaining: 624ms
282:	learn: 0.0031555	total: 29.4s	remaining: 520ms
283:	learn: 0.0031555	total: 29.6s	remaining: 417ms
284:	learn: 0.0031555	total: 29.7s	remaining: 313ms
285:	learn: 0.0031555	total: 29.9s	remaining: 209ms
286:	learn: 0.0031555	total: 30s	remaining: 105ms
287:	learn: 0.0031555	total: 30.2s	remaining: 0us
[I 2024-12-19 15:05:43,154] Trial 43 finished with value: 79.65904428453553 and parameters: {'learning_rate': 0.08594556429165298, 'max_depth': 6, 'n_estimators': 288, 'scale_pos_weight': 5.469096902172981}. Best is trial 37 with value: 80.67361029056296.
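The trial log above reports four tuned hyperparameters (`learning_rate`, `max_depth`, `n_estimators`, `scale_pos_weight`). A minimal sketch of an Optuna-style objective with that search space is shown below; `evaluate_auprc` is a hypothetical stand-in for training `CatBoostClassifier` on the SMOTE-balanced folds and averaging AUPRC, and the exact ranges are assumptions, not the notebook's actual code.

```python
# Sketch of an Optuna objective matching the search space seen in the trial
# log (learning_rate, max_depth, n_estimators, scale_pos_weight).

def evaluate_auprc(params):
    # Hypothetical placeholder: in the notebook this would train
    # CatBoostClassifier(**params) per fold and return mean AUPRC (x100).
    return 80.0

def objective(trial):
    params = {
        "learning_rate": trial.suggest_float("learning_rate", 0.01, 0.3, log=True),
        "max_depth": trial.suggest_int("max_depth", 3, 10),
        "n_estimators": trial.suggest_int("n_estimators", 100, 500),
        "scale_pos_weight": trial.suggest_float("scale_pos_weight", 1.0, 10.0),
    }
    return evaluate_auprc(params)

# With Optuna installed, one would then run:
# study = optuna.create_study(direction="maximize")
# study.optimize(objective, n_trials=50)
```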
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.74
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.87
 - F1-Score_Train: 99.87
 - Precision_Test: 30.36
 - Recall_Test: 86.51
 - AUPRC_Test: 80.17
 - Accuracy_Test: 99.64
 - F1-Score_Test: 44.95
 - max_depth: 6
 - n_estimators: 288
 - learning_rate: 0.09
 - scale_pos_weight: 5.47
 - (remaining CatBoost parameters: None, omitted)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 79.6590
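The AUPRC values reported throughout (e.g. 79.66 averaged over the folds) correspond to average precision scaled to a percentage. A toy illustration with scikit-learn's `average_precision_score` (the labels and scores below are invented for the example):

```python
from sklearn.metrics import average_precision_score

# Toy example of the AUPRC metric: average precision on predicted fraud
# probabilities, scaled to a percentage.
y_true = [0, 0, 0, 0, 1, 1]                 # 1 = fraud
y_score = [0.1, 0.2, 0.15, 0.3, 0.8, 0.9]   # model probabilities

auprc = 100 * average_precision_score(y_true, y_score)
print(f"AUPRC: {auprc:.2f}")  # perfect ranking of the two positives -> 100.00
```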

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
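The proportions printed above show SMOTE rebalancing the ~0.17% fraud class to a 50/50 split on the training fold. A simplified, self-contained stand-in for `imblearn`'s SMOTE (interpolating between random minority pairs; the sample sizes are arbitrary for illustration):

```python
import numpy as np

# Simplified SMOTE-like oversampling: synthesize minority points by linear
# interpolation between random minority pairs until classes are balanced.
rng = np.random.default_rng(0)
X_maj = rng.normal(0.0, 1.0, size=(994, 2))  # class 0 (non-fraud)
X_min = rng.normal(3.0, 1.0, size=(6, 2))    # class 1 (fraud), ~0.6%

n_new = len(X_maj) - len(X_min)
i = rng.integers(0, len(X_min), n_new)
j = rng.integers(0, len(X_min), n_new)
lam = rng.random((n_new, 1))
X_syn = X_min[i] + lam * (X_min[j] - X_min[i])  # interpolated points

X_bal = np.vstack([X_maj, X_min, X_syn])
y_bal = np.concatenate([np.zeros(len(X_maj)), np.ones(len(X_min) + n_new)])
print(np.bincount(y_bal.astype(int)) / len(y_bal))  # -> [0.5 0.5]
```

As in the notebook, resampling must be applied only to the training fold, never to the test data, so the evaluation reflects the true class imbalance.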

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[... verbose CatBoost training output: iterations 0-279, learn loss 0.5240 -> 0.0030669, total 30.8s ...]

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.65
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.82
 - F1-Score_Train: 99.83
 - Precision_Test: 24.43
 - Recall_Test: 84.92
 - AUPRC_Test: 77.84
 - Accuracy_Test: 99.53
 - F1-Score_Test: 37.94
 - max_depth: 6
 - n_estimators: 280
 - learning_rate: 0.07
 - scale_pos_weight: 6.77
 - (remaining CatBoost parameters: None, omitted)
✅ Tamaño del DataFrame actualizado: (1, 133)

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5347037	total: 100ms	remaining: 28s
1:	learn: 0.3850156	total: 191ms	remaining: 26.6s
2:	learn: 0.3002381	total: 280ms	remaining: 25.9s
... [iterations 3–278 omitted: learn loss decreases steadily and plateaus at 0.0032687 from iteration ~233] ...
279:	learn: 0.0032687	total: 33.8s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.66
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.83
 - F1-Score_Train: 99.83
 - Precision_Test: 28.13
 - Recall_Test: 87.30
 - AUPRC_Test: 79.68
 - Accuracy_Test: 99.60
 - F1-Score_Test: 42.55
 - max_depth: 6
 - n_estimators: 280
 - learning_rate: 0.07
 - scale_pos_weight: 6.77
 - (all remaining CatBoost parameters: None — library defaults)
✅ Tamaño del DataFrame actualizado: (2, 133)
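The per-fold block above reports Precision/Recall/AUPRC on train and test plus a binary "Sobreajuste" (overfitting) flag; the large gap between AUPRC_Train (≈100) and AUPRC_Test (≈80) is what trips that flag. A hedged scikit-learn sketch of how such metrics are computed — the toy scores and the gap cutoff are illustrative, not the notebook's values:

```python
import numpy as np
from sklearn.metrics import average_precision_score, precision_score, recall_score

def fold_metrics(y_true, y_score, threshold=0.5):
    """Metrics as reported per fold (in percent): AUPRC uses the raw
    probability scores, precision/recall use hard labels at `threshold`."""
    y_pred = (y_score >= threshold).astype(int)
    return {
        "AUPRC": 100 * average_precision_score(y_true, y_score),
        "Precision": 100 * precision_score(y_true, y_pred, zero_division=0),
        "Recall": 100 * recall_score(y_true, y_pred, zero_division=0),
    }

# Toy example: two frauds (1) among four transactions
y_true = np.array([0, 0, 1, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8])
m = fold_metrics(y_true, y_score)

# The overfitting flag compares train vs. test AUPRC; e.g. with an
# illustrative 10-point cutoff: flag = int(auprc_train - auprc_test > 10)
flag = int(99.99 - 79.68 > 10)  # fold 2 values above → flag = 1
```

AUPRC is the headline metric here because, with ~0.17% positives, accuracy stays near 99.6% even for weak models, as the Accuracy_Test rows above show.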

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5307739	total: 82ms	remaining: 22.9s
1:	learn: 0.3859729	total: 168ms	remaining: 23.4s
2:	learn: 0.2841797	total: 256ms	remaining: 23.6s
... [iterations 3–278 omitted: learn loss decreases steadily and plateaus near 0.0030159 from iteration ~260] ...
279:	learn: 0.0030159	total: 30.9s	remaining: 0us
[I 2024-12-19 15:07:25,729] Trial 44 finished with value: 78.83109626445729 and parameters: {'learning_rate': 0.07095231087224999, 'max_depth': 6, 'n_estimators': 280, 'scale_pos_weight': 6.767127395574064}. Best is trial 37 with value: 80.67361029056296.
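The Optuna trace above (Trial 44 scoring ≈78.83, best so far Trial 37 at ≈80.67) searches over `learning_rate`, `max_depth`, `n_estimators` and `scale_pos_weight`. The loop Optuna's sampler automates can be sketched as a plain random search; the bounds below are inferred from the logged trial values and `evaluate` is a stand-in for the real fold-level CatBoost training, so treat this as a sketch rather than the notebook's objective:

```python
import random

def sample_params(rng):
    """One trial's hyperparameters (illustrative bounds inferred from the log)."""
    return {
        "learning_rate": rng.uniform(0.01, 0.3),
        "max_depth": rng.randint(4, 10),
        "n_estimators": rng.randrange(100, 501, 10),
        "scale_pos_weight": rng.uniform(1.0, 10.0),
    }

def evaluate(params):
    """Stand-in for training CatBoost on the SMOTE-balanced fold and
    returning mean test AUPRC (a dummy score peaking near lr = 0.07)."""
    return 80.0 - abs(params["learning_rate"] - 0.07) * 50

rng = random.Random(42)
best_value, best_params = float("-inf"), None
for trial in range(45):
    params = sample_params(rng)
    value = evaluate(params)
    if value > best_value:  # track the best trial, like study.best_trial
        best_value, best_params = value, params
```

Optuna improves on this by sampling new trials from regions that scored well previously (its TPE sampler), which is why later trials in the log cluster near the eventual best parameters.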
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.66
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.83
 - F1-Score_Train: 99.83
 - Precision_Test: 25.59
 - Recall_Test: 85.71
 - AUPRC_Test: 78.97
 - Accuracy_Test: 99.56
 - F1-Score_Test: 39.42
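The headline metric in these reports is AUPRC rather than accuracy: with ~0.17% fraud, accuracy is 99.56% even while test precision is only ~26%. A minimal sketch of how such a score is computed with scikit-learn's `average_precision_score` on hypothetical toy data (not the notebook's):

```python
import numpy as np
from sklearn.metrics import average_precision_score

# Hypothetical toy split: 95 legitimate transactions, 5 frauds.
y_true = np.array([0] * 95 + [1] * 5)
rng = np.random.default_rng(42)
# Scores where frauds tend (but are not guaranteed) to rank higher.
scores = np.where(y_true == 1,
                  rng.uniform(0.6, 1.0, size=y_true.size),
                  rng.uniform(0.0, 0.7, size=y_true.size))
auprc = average_precision_score(y_true, scores) * 100
print(f"AUPRC_Test: {auprc:.2f}")
```

Unlike accuracy, this summary of the precision-recall curve ignores the overwhelming true-negative count, so it degrades visibly when frauds are ranked poorly.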
 - Tuned hyperparameters: learning_rate=0.07, max_depth=6, n_estimators=280, scale_pos_weight=6.77
 - All remaining CatBoost parameters: None (library defaults)
✅ Updated DataFrame shape: (3, 133)

🏆 Mean cross-validated AUPRC: 78.8311

🔍 Optimizing hyperparameters for CatBoost with Optuna...

🔄 Fold 1: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
[Verbose CatBoost per-iteration log elided: 286 iterations, learn loss 0.5047 → 0.0030, total time 31.1s]

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.72
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.86
 - F1-Score_Train: 99.86
 - Precision_Test: 26.10
 - Recall_Test: 84.92
 - AUPRC_Test: 79.93
 - Accuracy_Test: 99.57
 - F1-Score_Test: 39.93
 - Tuned hyperparameters: learning_rate=0.09, max_depth=6, n_estimators=286, scale_pos_weight=5.78
 - All remaining CatBoost parameters: None (library defaults)
✅ Updated DataFrame shape: (1, 133)

🔄 Fold 2: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
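The per-fold class-proportion report printed above can be reproduced with pandas' `value_counts(normalize=True)`. A small sketch on hypothetical labels (the 997/3 split is illustrative, not the dataset's exact counts):

```python
import pandas as pd

# Hypothetical fold labels: 997 legitimate transactions, 3 frauds.
y = pd.Series([0] * 997 + [1] * 3, name="Class")

# normalize=True returns proportions instead of raw counts.
print("📊 Before SMOTE:", y.value_counts(normalize=True), sep="\n")
```

After a 50/50 resampling, the same call would report 0.5 for each class, which is exactly the check the notebook logs after SMOTE.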

🚀 Training CatBoost (Optuna with SMOTE)...
[Verbose CatBoost per-iteration log elided: iterations 0–240, learn loss 0.5067 → 0.0032]
241:	learn: 0.0032368	total: 27.7s	remaining: 5.04s
242:	learn: 0.0032368	total: 27.8s	remaining: 4.92s
243:	learn: 0.0032368	total: 27.9s	remaining: 4.8s
244:	learn: 0.0032368	total: 28s	remaining: 4.68s
245:	learn: 0.0032367	total: 28.1s	remaining: 4.56s
246:	learn: 0.0032367	total: 28.1s	remaining: 4.44s
247:	learn: 0.0032367	total: 28.2s	remaining: 4.33s
248:	learn: 0.0032367	total: 28.3s	remaining: 4.21s
249:	learn: 0.0032367	total: 28.4s	remaining: 4.09s
250:	learn: 0.0032367	total: 28.5s	remaining: 3.97s
251:	learn: 0.0032367	total: 28.6s	remaining: 3.85s
252:	learn: 0.0032367	total: 28.6s	remaining: 3.73s
253:	learn: 0.0032367	total: 28.7s	remaining: 3.62s
254:	learn: 0.0032367	total: 28.8s	remaining: 3.5s
255:	learn: 0.0032367	total: 28.9s	remaining: 3.38s
256:	learn: 0.0032367	total: 28.9s	remaining: 3.27s
257:	learn: 0.0032367	total: 29s	remaining: 3.15s
258:	learn: 0.0032366	total: 29.1s	remaining: 3.03s
259:	learn: 0.0032367	total: 29.2s	remaining: 2.92s
260:	learn: 0.0032367	total: 29.3s	remaining: 2.81s
261:	learn: 0.0032367	total: 29.4s	remaining: 2.69s
262:	learn: 0.0032366	total: 29.5s	remaining: 2.58s
263:	learn: 0.0032366	total: 29.5s	remaining: 2.46s
264:	learn: 0.0032366	total: 29.6s	remaining: 2.35s
265:	learn: 0.0032366	total: 29.7s	remaining: 2.23s
266:	learn: 0.0032366	total: 29.8s	remaining: 2.12s
267:	learn: 0.0032366	total: 29.9s	remaining: 2.01s
268:	learn: 0.0032366	total: 30s	remaining: 1.89s
269:	learn: 0.0032366	total: 30.1s	remaining: 1.78s
270:	learn: 0.0032366	total: 30.1s	remaining: 1.67s
271:	learn: 0.0032366	total: 30.2s	remaining: 1.55s
272:	learn: 0.0032366	total: 30.3s	remaining: 1.44s
273:	learn: 0.0032366	total: 30.4s	remaining: 1.33s
274:	learn: 0.0032365	total: 30.4s	remaining: 1.22s
275:	learn: 0.0032365	total: 30.5s	remaining: 1.11s
276:	learn: 0.0032365	total: 30.6s	remaining: 995ms
277:	learn: 0.0032365	total: 30.7s	remaining: 884ms
278:	learn: 0.0032365	total: 30.8s	remaining: 772ms
279:	learn: 0.0032365	total: 30.8s	remaining: 661ms
280:	learn: 0.0032365	total: 30.9s	remaining: 550ms
281:	learn: 0.0032365	total: 31s	remaining: 440ms
282:	learn: 0.0032365	total: 31.1s	remaining: 330ms
283:	learn: 0.0032365	total: 31.2s	remaining: 220ms
284:	learn: 0.0032365	total: 31.3s	remaining: 110ms
285:	learn: 0.0032365	total: 31.3s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.71
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.85
 - Precision_Test: 31.70
 - Recall_Test: 87.30
 - AUPRC_Test: 79.10
 - Accuracy_Test: 99.66
 - F1-Score_Test: 46.51
 - max_depth: 6
 - n_estimators: 286
 - learning_rate: 0.09
 - scale_pos_weight: 5.78
 - (all other CatBoost parameters: None, i.e. library defaults — omitted for brevity)
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5075988	total: 83.6ms	remaining: 23.8s
1:	learn: 0.3627596	total: 188ms	remaining: 26.6s
2:	learn: 0.2796440	total: 272ms	remaining: 25.7s
…	(iterations 3–284 omitted; the training loss plateaus at ~0.0030 from iteration ~240 onward)
285:	learn: 0.0029907	total: 30.9s	remaining: 0us
[I 2024-12-19 15:09:06,095] Trial 45 finished with value: 79.66196040263941 and parameters: {'learning_rate': 0.08540577895279673, 'max_depth': 6, 'n_estimators': 286, 'scale_pos_weight': 5.7780633804501385}. Best is trial 37 with value: 80.67361029056296.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.76
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.88
 - F1-Score_Train: 99.88
 - Precision_Test: 30.57
 - Recall_Test: 84.92
 - AUPRC_Test: 79.96
 - Accuracy_Test: 99.65
 - F1-Score_Test: 44.96
 - max_depth: 6
 - n_estimators: 286
 - learning_rate: 0.09
 - scale_pos_weight: 5.78
 - (all other CatBoost parameters: None, i.e. library defaults — omitted for brevity)
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 79.6620
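The cross-validation average reported above is the mean of the per-fold AUPRC values, each computed on the held-out split. AUPRC is scikit-learn's average precision; a tiny illustrative example (toy labels and scores, not the notebook's data):

```python
from sklearn.metrics import average_precision_score

# Toy example: two positives, two negatives, with imperfect scores
y_true = [0, 0, 1, 1]
y_score = [0.1, 0.4, 0.35, 0.8]

# Average precision = sum over thresholds of (recall step) * precision
auprc = average_precision_score(y_true, y_score)
print(round(100 * auprc, 4))  # -> 83.3333, on the 0-100 scale used in the log
```

Unlike accuracy, AUPRC ignores true negatives entirely, which is why it is the metric of choice here: with ~0.17 % fraud, a model that predicts "legitimate" for everything scores 99.8 % accuracy but near-zero AUPRC.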

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5686996	total: 174ms	remaining: 47.4s
1:	learn: 0.4436812	total: 347ms	remaining: 47s
2:	learn: 0.3689664	total: 520ms	remaining: 46.8s
…	(iterations 3–142 omitted; the training loss decreases steadily)
143:	learn: 0.0059568	total: 17.7s	remaining: 15.9s
144:	learn: 0.0059218	total: 17.9s	remaining: 15.8s
145:	learn: 0.0058726	total: 18.1s	remaining: 15.8s
146:	learn: 0.0058038	total: 18.3s	remaining: 15.7s
147:	learn: 0.0057551	total: 18.5s	remaining: 15.6s
148:	learn: 0.0057253	total: 18.6s	remaining: 15.5s
149:	learn: 0.0056719	total: 18.8s	remaining: 15.4s
150:	learn: 0.0056057	total: 19s	remaining: 15.3s
151:	learn: 0.0055676	total: 19.2s	remaining: 15.3s
152:	learn: 0.0055195	total: 19.3s	remaining: 15.2s
153:	learn: 0.0054655	total: 19.5s	remaining: 15.1s
154:	learn: 0.0053779	total: 19.7s	remaining: 15s
155:	learn: 0.0053709	total: 19.9s	remaining: 14.9s
156:	learn: 0.0053423	total: 20s	remaining: 14.8s
157:	learn: 0.0053189	total: 20.2s	remaining: 14.7s
158:	learn: 0.0052781	total: 20.3s	remaining: 14.6s
159:	learn: 0.0052436	total: 20.4s	remaining: 14.4s
160:	learn: 0.0051970	total: 20.5s	remaining: 14.3s
161:	learn: 0.0051565	total: 20.6s	remaining: 14.1s
162:	learn: 0.0051309	total: 20.7s	remaining: 13.9s
163:	learn: 0.0051086	total: 20.8s	remaining: 13.8s
164:	learn: 0.0050546	total: 20.9s	remaining: 13.7s
165:	learn: 0.0050110	total: 21s	remaining: 13.5s
166:	learn: 0.0049923	total: 21.1s	remaining: 13.4s
167:	learn: 0.0049348	total: 21.2s	remaining: 13.2s
168:	learn: 0.0048971	total: 21.3s	remaining: 13.1s
169:	learn: 0.0048668	total: 21.4s	remaining: 12.9s
170:	learn: 0.0048114	total: 21.5s	remaining: 12.8s
171:	learn: 0.0047703	total: 21.5s	remaining: 12.7s
172:	learn: 0.0047369	total: 21.7s	remaining: 12.5s
173:	learn: 0.0046894	total: 21.7s	remaining: 12.4s
174:	learn: 0.0046707	total: 21.8s	remaining: 12.2s
175:	learn: 0.0046463	total: 21.9s	remaining: 12.1s
176:	learn: 0.0045862	total: 22s	remaining: 12s
177:	learn: 0.0045543	total: 22.2s	remaining: 11.8s
178:	learn: 0.0045348	total: 22.3s	remaining: 11.7s
179:	learn: 0.0044884	total: 22.3s	remaining: 11.5s
180:	learn: 0.0044667	total: 22.5s	remaining: 11.4s
181:	learn: 0.0044423	total: 22.5s	remaining: 11.3s
182:	learn: 0.0044093	total: 22.6s	remaining: 11.1s
183:	learn: 0.0043825	total: 22.7s	remaining: 11s
184:	learn: 0.0043545	total: 22.8s	remaining: 10.9s
185:	learn: 0.0043204	total: 22.9s	remaining: 10.7s
186:	learn: 0.0042867	total: 23.1s	remaining: 10.6s
187:	learn: 0.0042492	total: 23.1s	remaining: 10.5s
188:	learn: 0.0042253	total: 23.2s	remaining: 10.3s
189:	learn: 0.0042253	total: 23.3s	remaining: 10.2s
190:	learn: 0.0042081	total: 23.4s	remaining: 10s
191:	learn: 0.0041839	total: 23.5s	remaining: 9.91s
192:	learn: 0.0041526	total: 23.6s	remaining: 9.79s
193:	learn: 0.0041103	total: 23.7s	remaining: 9.66s
194:	learn: 0.0040924	total: 23.8s	remaining: 9.52s
195:	learn: 0.0040393	total: 23.9s	remaining: 9.4s
196:	learn: 0.0040128	total: 24s	remaining: 9.27s
197:	learn: 0.0039947	total: 24.1s	remaining: 9.13s
198:	learn: 0.0039518	total: 24.2s	remaining: 9.01s
199:	learn: 0.0039325	total: 24.3s	remaining: 8.87s
200:	learn: 0.0039180	total: 24.4s	remaining: 8.73s
201:	learn: 0.0038976	total: 24.5s	remaining: 8.61s
202:	learn: 0.0038829	total: 24.6s	remaining: 8.47s
203:	learn: 0.0038270	total: 24.7s	remaining: 8.34s
204:	learn: 0.0038070	total: 24.8s	remaining: 8.22s
205:	learn: 0.0037789	total: 24.9s	remaining: 8.09s
206:	learn: 0.0037789	total: 24.9s	remaining: 7.95s
207:	learn: 0.0037549	total: 25s	remaining: 7.83s
208:	learn: 0.0037549	total: 25.1s	remaining: 7.69s
209:	learn: 0.0037468	total: 25.2s	remaining: 7.56s
210:	learn: 0.0037187	total: 25.3s	remaining: 7.44s
211:	learn: 0.0037040	total: 25.4s	remaining: 7.3s
212:	learn: 0.0036888	total: 25.5s	remaining: 7.17s
213:	learn: 0.0036888	total: 25.6s	remaining: 7.05s
214:	learn: 0.0036795	total: 25.6s	remaining: 6.92s
215:	learn: 0.0036795	total: 25.7s	remaining: 6.78s
216:	learn: 0.0036667	total: 25.8s	remaining: 6.66s
217:	learn: 0.0036435	total: 25.9s	remaining: 6.54s
218:	learn: 0.0036051	total: 26s	remaining: 6.41s
219:	learn: 0.0035646	total: 26.1s	remaining: 6.3s
220:	learn: 0.0035500	total: 26.2s	remaining: 6.17s
221:	learn: 0.0035500	total: 26.3s	remaining: 6.04s
222:	learn: 0.0035362	total: 26.4s	remaining: 5.92s
223:	learn: 0.0035011	total: 26.5s	remaining: 5.79s
224:	learn: 0.0034952	total: 26.6s	remaining: 5.67s
225:	learn: 0.0034679	total: 26.7s	remaining: 5.55s
226:	learn: 0.0034418	total: 26.8s	remaining: 5.42s
227:	learn: 0.0034223	total: 26.9s	remaining: 5.3s
228:	learn: 0.0034223	total: 26.9s	remaining: 5.18s
229:	learn: 0.0034093	total: 27s	remaining: 5.05s
230:	learn: 0.0033720	total: 27.1s	remaining: 4.93s
231:	learn: 0.0033720	total: 27.2s	remaining: 4.81s
232:	learn: 0.0033720	total: 27.3s	remaining: 4.69s
233:	learn: 0.0033614	total: 27.4s	remaining: 4.56s
234:	learn: 0.0033614	total: 27.5s	remaining: 4.44s
235:	learn: 0.0033613	total: 27.5s	remaining: 4.32s
236:	learn: 0.0033510	total: 27.6s	remaining: 4.2s
237:	learn: 0.0033510	total: 27.7s	remaining: 4.08s
238:	learn: 0.0033405	total: 27.8s	remaining: 3.96s
239:	learn: 0.0033165	total: 27.9s	remaining: 3.84s
240:	learn: 0.0033067	total: 28s	remaining: 3.72s
241:	learn: 0.0032838	total: 28.1s	remaining: 3.6s
242:	learn: 0.0032838	total: 28.2s	remaining: 3.48s
243:	learn: 0.0032838	total: 28.3s	remaining: 3.36s
244:	learn: 0.0032785	total: 28.4s	remaining: 3.24s
245:	learn: 0.0032785	total: 28.5s	remaining: 3.12s
246:	learn: 0.0032666	total: 28.6s	remaining: 3.01s
247:	learn: 0.0032493	total: 28.7s	remaining: 2.89s
248:	learn: 0.0032328	total: 28.7s	remaining: 2.77s
249:	learn: 0.0032328	total: 28.8s	remaining: 2.65s
250:	learn: 0.0032139	total: 28.9s	remaining: 2.53s
251:	learn: 0.0032139	total: 29s	remaining: 2.41s
252:	learn: 0.0032138	total: 29.1s	remaining: 2.3s
253:	learn: 0.0032138	total: 29.1s	remaining: 2.18s
254:	learn: 0.0032138	total: 29.2s	remaining: 2.06s
255:	learn: 0.0032138	total: 29.3s	remaining: 1.95s
256:	learn: 0.0032138	total: 29.4s	remaining: 1.83s
257:	learn: 0.0032083	total: 29.5s	remaining: 1.71s
258:	learn: 0.0032083	total: 29.5s	remaining: 1.59s
259:	learn: 0.0032083	total: 29.6s	remaining: 1.48s
260:	learn: 0.0032083	total: 29.7s	remaining: 1.36s
261:	learn: 0.0032082	total: 29.8s	remaining: 1.25s
262:	learn: 0.0032082	total: 29.8s	remaining: 1.13s
263:	learn: 0.0032082	total: 29.9s	remaining: 1.02s
264:	learn: 0.0031918	total: 30s	remaining: 906ms
265:	learn: 0.0031918	total: 30.1s	remaining: 792ms
266:	learn: 0.0031918	total: 30.2s	remaining: 678ms
267:	learn: 0.0031918	total: 30.3s	remaining: 565ms
268:	learn: 0.0031918	total: 30.4s	remaining: 452ms
269:	learn: 0.0031918	total: 30.5s	remaining: 339ms
270:	learn: 0.0031918	total: 30.7s	remaining: 226ms
271:	learn: 0.0031918	total: 30.8s	remaining: 113ms
272:	learn: 0.0031918	total: 31s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.68
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.84
 - F1-Score_Train: 99.84
 - Precision_Test: 25.00
 - Recall_Test: 85.71
 - AUPRC_Test: 80.11
 - Accuracy_Test: 99.54
 - F1-Score_Test: 38.71
 - max_depth: 6
 - n_estimators: 273
 - learning_rate: 0.05
 - scale_pos_weight: 6.08
 - (all other CatBoost parameters: None, i.e. library defaults; omitted here)
✅ Tamaño del DataFrame actualizado: (1, 133)
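
The `Sobreajuste: 1` flag above reflects the train/test gap (100.00 train recall vs 85.71 on test, 99.99 vs 80.11 AUPRC). A simple rule of this kind can be sketched as follows; the 5-point threshold is an illustrative assumption, not the notebook's exact criterion:

```python
# Minimal sketch: flag overfitting when the train-test gap of a metric
# exceeds a threshold. The 5-point threshold is an illustrative assumption.
def overfit_flag(metric_train: float, metric_test: float,
                 threshold: float = 5.0) -> int:
    """Return 1 if the metric drops more than `threshold` points from train to test."""
    return int((metric_train - metric_test) > threshold)

# Values from the fold above: Recall_Train = 100.00, Recall_Test = 85.71
print(overfit_flag(100.00, 85.71))  # -> 1
print(overfit_flag(90.00, 88.00))   # -> 0
```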

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
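
The class distributions printed before and after balancing come from pandas' `value_counts(normalize=True)`. A minimal self-contained sketch; the series below is synthetic and illustrative, not the real dataset:

```python
# Minimal sketch: class proportions as printed before SMOTE.
# The series below is synthetic, not the real credit-card dataset.
import pandas as pd

y = pd.Series([0] * 593 + [1] * 1, name="Class")  # extreme imbalance
proportions = y.value_counts(normalize=True)
# With pandas >= 2.0 the resulting Series is named "proportion",
# matching the "Name: proportion, dtype: float64" lines in the log above.
print(proportions)
```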

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5648188	total: 172ms	remaining: 46.9s
⋮	(intermediate CatBoost iterations omitted)
272:	learn: 0.0037879	total: 31.8s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.64
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.82
 - F1-Score_Train: 99.82
 - Precision_Test: 27.07
 - Recall_Test: 88.10
 - AUPRC_Test: 80.60
 - Accuracy_Test: 99.58
 - F1-Score_Test: 41.42
 - max_depth: 6
 - n_estimators: 273
 - learning_rate: 0.05
 - scale_pos_weight: 6.08
 - (all other CatBoost parameters: None, i.e. library defaults; omitted here)
✅ Tamaño del DataFrame actualizado: (2, 133)
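
The per-fold test metrics reported above (Precision, Recall, AUPRC, Accuracy, F1) can be computed with scikit-learn. A minimal sketch on synthetic predictions; `y_test`, `y_pred`, and `y_proba` here are illustrative stand-ins, not the notebook's variables:

```python
# Minimal sketch: per-fold evaluation metrics with scikit-learn,
# reported as percentages like in the log above. Data is synthetic.
import numpy as np
from sklearn.metrics import (accuracy_score, average_precision_score,
                             f1_score, precision_score, recall_score)

y_test  = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1])
y_proba = np.array([0.1, 0.2, 0.8, 0.3, 0.2, 0.1, 0.9, 0.7, 0.6, 0.4])
y_pred  = (y_proba >= 0.5).astype(int)  # default 0.5 decision threshold

metrics = {
    "Precision_Test": precision_score(y_test, y_pred) * 100,
    "Recall_Test":    recall_score(y_test, y_pred) * 100,
    "AUPRC_Test":     average_precision_score(y_test, y_proba) * 100,  # PR-curve area
    "Accuracy_Test":  accuracy_score(y_test, y_pred) * 100,
    "F1-Score_Test":  f1_score(y_test, y_pred) * 100,
}
for name, value in metrics.items():
    print(f" - {name}: {value:.2f}")
```

Note that AUPRC is computed from the predicted probabilities, while the other metrics depend on the chosen decision threshold.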

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5705567	total: 83ms	remaining: 22.6s
⋮	(intermediate CatBoost iterations omitted)
85:	learn: 0.0160736	total: 8.52s	remaining: 18.5s
86:	learn: 0.0157792	total: 8.61s	remaining: 18.4s
87:	learn: 0.0155145	total: 8.73s	remaining: 18.4s
88:	learn: 0.0153937	total: 8.81s	remaining: 18.2s
89:	learn: 0.0151624	total: 8.89s	remaining: 18.1s
90:	learn: 0.0150423	total: 9.02s	remaining: 18s
91:	learn: 0.0147652	total: 9.18s	remaining: 18.1s
92:	learn: 0.0146735	total: 9.34s	remaining: 18.1s
93:	learn: 0.0144428	total: 9.46s	remaining: 18s
94:	learn: 0.0142818	total: 9.64s	remaining: 18.1s
95:	learn: 0.0140869	total: 9.82s	remaining: 18.1s
96:	learn: 0.0137567	total: 10s	remaining: 18.2s
97:	learn: 0.0135430	total: 10.2s	remaining: 18.2s
98:	learn: 0.0133393	total: 10.4s	remaining: 18.3s
99:	learn: 0.0131549	total: 10.6s	remaining: 18.3s
100:	learn: 0.0129526	total: 10.8s	remaining: 18.3s
101:	learn: 0.0127078	total: 10.9s	remaining: 18.3s
102:	learn: 0.0126336	total: 11.1s	remaining: 18.3s
103:	learn: 0.0124931	total: 11.3s	remaining: 18.4s
104:	learn: 0.0123345	total: 11.5s	remaining: 18.4s
105:	learn: 0.0121614	total: 11.7s	remaining: 18.4s
106:	learn: 0.0120479	total: 11.9s	remaining: 18.4s
107:	learn: 0.0118356	total: 12s	remaining: 18.4s
108:	learn: 0.0116542	total: 12.2s	remaining: 18.4s
109:	learn: 0.0114707	total: 12.4s	remaining: 18.4s
110:	learn: 0.0113094	total: 12.6s	remaining: 18.4s
111:	learn: 0.0111714	total: 12.8s	remaining: 18.4s
112:	learn: 0.0110985	total: 12.9s	remaining: 18.3s
113:	learn: 0.0110256	total: 13.1s	remaining: 18.3s
114:	learn: 0.0109251	total: 13.3s	remaining: 18.2s
115:	learn: 0.0107598	total: 13.4s	remaining: 18.2s
116:	learn: 0.0106324	total: 13.6s	remaining: 18.1s
117:	learn: 0.0104645	total: 13.8s	remaining: 18.1s
118:	learn: 0.0103439	total: 14s	remaining: 18.1s
119:	learn: 0.0102504	total: 14.1s	remaining: 18s
120:	learn: 0.0101525	total: 14.3s	remaining: 17.9s
121:	learn: 0.0100100	total: 14.5s	remaining: 17.9s
122:	learn: 0.0099675	total: 14.7s	remaining: 17.9s
123:	learn: 0.0098264	total: 14.8s	remaining: 17.8s
124:	learn: 0.0096422	total: 15s	remaining: 17.8s
125:	learn: 0.0095232	total: 15.2s	remaining: 17.7s
126:	learn: 0.0093983	total: 15.4s	remaining: 17.7s
127:	learn: 0.0092725	total: 15.6s	remaining: 17.6s
128:	learn: 0.0091746	total: 15.7s	remaining: 17.6s
129:	learn: 0.0091408	total: 15.9s	remaining: 17.5s
130:	learn: 0.0090002	total: 16s	remaining: 17.4s
131:	learn: 0.0088884	total: 16.1s	remaining: 17.2s
132:	learn: 0.0087592	total: 16.2s	remaining: 17.1s
133:	learn: 0.0086319	total: 16.3s	remaining: 16.9s
134:	learn: 0.0085512	total: 16.4s	remaining: 16.8s
135:	learn: 0.0084610	total: 16.5s	remaining: 16.6s
136:	learn: 0.0083984	total: 16.6s	remaining: 16.5s
137:	learn: 0.0083131	total: 16.7s	remaining: 16.3s
138:	learn: 0.0082482	total: 16.8s	remaining: 16.2s
139:	learn: 0.0081530	total: 16.9s	remaining: 16.1s
140:	learn: 0.0080968	total: 17s	remaining: 15.9s
141:	learn: 0.0080659	total: 17.1s	remaining: 15.8s
142:	learn: 0.0079943	total: 17.2s	remaining: 15.6s
143:	learn: 0.0079459	total: 17.3s	remaining: 15.5s
144:	learn: 0.0078575	total: 17.4s	remaining: 15.4s
145:	learn: 0.0077477	total: 17.5s	remaining: 15.2s
146:	learn: 0.0076828	total: 17.6s	remaining: 15.1s
147:	learn: 0.0076470	total: 17.7s	remaining: 15s
148:	learn: 0.0075311	total: 17.8s	remaining: 14.8s
149:	learn: 0.0074620	total: 17.9s	remaining: 14.7s
150:	learn: 0.0073617	total: 18s	remaining: 14.5s
151:	learn: 0.0072852	total: 18.1s	remaining: 14.4s
152:	learn: 0.0072103	total: 18.2s	remaining: 14.3s
153:	learn: 0.0071241	total: 18.3s	remaining: 14.1s
154:	learn: 0.0070905	total: 18.4s	remaining: 14s
155:	learn: 0.0070389	total: 18.5s	remaining: 13.9s
156:	learn: 0.0069653	total: 18.6s	remaining: 13.7s
157:	learn: 0.0068452	total: 18.7s	remaining: 13.6s
158:	learn: 0.0067646	total: 18.8s	remaining: 13.5s
159:	learn: 0.0067482	total: 18.9s	remaining: 13.4s
160:	learn: 0.0066884	total: 19s	remaining: 13.2s
161:	learn: 0.0066040	total: 19.1s	remaining: 13.1s
162:	learn: 0.0065437	total: 19.2s	remaining: 13s
163:	learn: 0.0064518	total: 19.3s	remaining: 12.8s
164:	learn: 0.0063931	total: 19.4s	remaining: 12.7s
165:	learn: 0.0063453	total: 19.5s	remaining: 12.6s
166:	learn: 0.0062528	total: 19.6s	remaining: 12.4s
167:	learn: 0.0061440	total: 19.7s	remaining: 12.3s
168:	learn: 0.0060752	total: 19.8s	remaining: 12.2s
169:	learn: 0.0060465	total: 19.9s	remaining: 12.1s
170:	learn: 0.0059691	total: 20s	remaining: 11.9s
171:	learn: 0.0059402	total: 20.1s	remaining: 11.8s
172:	learn: 0.0058849	total: 20.2s	remaining: 11.7s
173:	learn: 0.0058344	total: 20.3s	remaining: 11.6s
174:	learn: 0.0058141	total: 20.4s	remaining: 11.4s
175:	learn: 0.0057507	total: 20.5s	remaining: 11.3s
176:	learn: 0.0056652	total: 20.6s	remaining: 11.2s
177:	learn: 0.0056127	total: 20.7s	remaining: 11s
178:	learn: 0.0055587	total: 20.8s	remaining: 10.9s
179:	learn: 0.0054916	total: 20.9s	remaining: 10.8s
180:	learn: 0.0054119	total: 21s	remaining: 10.7s
181:	learn: 0.0053734	total: 21.1s	remaining: 10.5s
182:	learn: 0.0053470	total: 21.2s	remaining: 10.4s
183:	learn: 0.0053091	total: 21.3s	remaining: 10.3s
184:	learn: 0.0052929	total: 21.4s	remaining: 10.2s
185:	learn: 0.0052437	total: 21.5s	remaining: 10s
186:	learn: 0.0052207	total: 21.6s	remaining: 9.92s
187:	learn: 0.0052086	total: 21.7s	remaining: 9.79s
188:	learn: 0.0051878	total: 21.8s	remaining: 9.68s
189:	learn: 0.0051878	total: 21.8s	remaining: 9.54s
190:	learn: 0.0051460	total: 21.9s	remaining: 9.41s
191:	learn: 0.0051084	total: 22s	remaining: 9.3s
192:	learn: 0.0050980	total: 22.2s	remaining: 9.18s
193:	learn: 0.0050714	total: 22.2s	remaining: 9.06s
194:	learn: 0.0050443	total: 22.4s	remaining: 8.94s
195:	learn: 0.0049828	total: 22.4s	remaining: 8.81s
196:	learn: 0.0049461	total: 22.5s	remaining: 8.69s
197:	learn: 0.0049223	total: 22.6s	remaining: 8.57s
198:	learn: 0.0048666	total: 22.7s	remaining: 8.45s
199:	learn: 0.0048533	total: 22.8s	remaining: 8.33s
200:	learn: 0.0047873	total: 22.9s	remaining: 8.21s
201:	learn: 0.0047020	total: 23s	remaining: 8.09s
202:	learn: 0.0046716	total: 23.1s	remaining: 7.97s
203:	learn: 0.0046595	total: 23.2s	remaining: 7.85s
204:	learn: 0.0046282	total: 23.3s	remaining: 7.73s
205:	learn: 0.0046021	total: 23.4s	remaining: 7.6s
206:	learn: 0.0045702	total: 23.5s	remaining: 7.49s
207:	learn: 0.0045296	total: 23.6s	remaining: 7.37s
208:	learn: 0.0045098	total: 23.7s	remaining: 7.25s
209:	learn: 0.0045098	total: 23.8s	remaining: 7.13s
210:	learn: 0.0044751	total: 23.9s	remaining: 7.01s
211:	learn: 0.0044630	total: 23.9s	remaining: 6.89s
212:	learn: 0.0044291	total: 24.1s	remaining: 6.78s
213:	learn: 0.0043776	total: 24.1s	remaining: 6.66s
214:	learn: 0.0043132	total: 24.2s	remaining: 6.54s
215:	learn: 0.0042966	total: 24.3s	remaining: 6.42s
216:	learn: 0.0042598	total: 24.4s	remaining: 6.31s
217:	learn: 0.0042598	total: 24.5s	remaining: 6.18s
218:	learn: 0.0042450	total: 24.6s	remaining: 6.07s
219:	learn: 0.0042248	total: 24.7s	remaining: 5.95s
220:	learn: 0.0042109	total: 24.8s	remaining: 5.83s
221:	learn: 0.0041688	total: 24.9s	remaining: 5.72s
222:	learn: 0.0041375	total: 25s	remaining: 5.61s
223:	learn: 0.0040924	total: 25.1s	remaining: 5.49s
224:	learn: 0.0040490	total: 25.2s	remaining: 5.38s
225:	learn: 0.0040490	total: 25.3s	remaining: 5.26s
226:	learn: 0.0040263	total: 25.4s	remaining: 5.14s
227:	learn: 0.0040263	total: 25.5s	remaining: 5.02s
228:	learn: 0.0040022	total: 25.5s	remaining: 4.91s
229:	learn: 0.0039869	total: 25.6s	remaining: 4.79s
230:	learn: 0.0039438	total: 25.8s	remaining: 4.68s
231:	learn: 0.0039438	total: 25.8s	remaining: 4.56s
232:	learn: 0.0039001	total: 26s	remaining: 4.46s
233:	learn: 0.0038882	total: 26.1s	remaining: 4.35s
234:	learn: 0.0038882	total: 26.2s	remaining: 4.24s
235:	learn: 0.0038736	total: 26.4s	remaining: 4.13s
236:	learn: 0.0038521	total: 26.5s	remaining: 4.03s
237:	learn: 0.0038429	total: 26.7s	remaining: 3.93s
238:	learn: 0.0038025	total: 26.9s	remaining: 3.83s
239:	learn: 0.0037708	total: 27.1s	remaining: 3.72s
240:	learn: 0.0037628	total: 27.2s	remaining: 3.62s
241:	learn: 0.0037561	total: 27.4s	remaining: 3.51s
242:	learn: 0.0037299	total: 27.6s	remaining: 3.4s
243:	learn: 0.0036822	total: 27.7s	remaining: 3.3s
244:	learn: 0.0036577	total: 27.9s	remaining: 3.19s
245:	learn: 0.0036453	total: 28s	remaining: 3.08s
246:	learn: 0.0036278	total: 28.2s	remaining: 2.97s
247:	learn: 0.0035862	total: 28.4s	remaining: 2.86s
248:	learn: 0.0035696	total: 28.6s	remaining: 2.75s
249:	learn: 0.0035538	total: 28.8s	remaining: 2.65s
250:	learn: 0.0035538	total: 28.8s	remaining: 2.53s
251:	learn: 0.0035344	total: 29s	remaining: 2.42s
252:	learn: 0.0035043	total: 29.2s	remaining: 2.3s
253:	learn: 0.0035043	total: 29.3s	remaining: 2.19s
254:	learn: 0.0034734	total: 29.5s	remaining: 2.08s
255:	learn: 0.0034543	total: 29.7s	remaining: 1.97s
256:	learn: 0.0034542	total: 29.8s	remaining: 1.85s
257:	learn: 0.0034472	total: 29.9s	remaining: 1.74s
258:	learn: 0.0034246	total: 30.1s	remaining: 1.62s
259:	learn: 0.0034056	total: 30.3s	remaining: 1.51s
260:	learn: 0.0033815	total: 30.4s	remaining: 1.4s
261:	learn: 0.0033677	total: 30.6s	remaining: 1.28s
262:	learn: 0.0033552	total: 30.7s	remaining: 1.17s
263:	learn: 0.0033254	total: 30.9s	remaining: 1.05s
264:	learn: 0.0033253	total: 31.1s	remaining: 938ms
265:	learn: 0.0032959	total: 31.2s	remaining: 822ms
266:	learn: 0.0032816	total: 31.4s	remaining: 706ms
267:	learn: 0.0032648	total: 31.6s	remaining: 590ms
268:	learn: 0.0032648	total: 31.7s	remaining: 472ms
269:	learn: 0.0032648	total: 31.9s	remaining: 354ms
270:	learn: 0.0032648	total: 32s	remaining: 236ms
271:	learn: 0.0032648	total: 32.1s	remaining: 118ms
272:	learn: 0.0032513	total: 32.3s	remaining: 0us
[I 2024-12-19 15:10:50,457] Trial 46 finished with value: 79.2970774657662 and parameters: {'learning_rate': 0.05325070684305075, 'max_depth': 6, 'n_estimators': 273, 'scale_pos_weight': 6.079585437760284}. Best is trial 37 with value: 80.67361029056296.
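The Optuna trials logged above search over `learning_rate`, `max_depth`, `n_estimators` and `scale_pos_weight`, maximizing cross-validated AUPRC. As a minimal stand-in for that loop (plain random search instead of Optuna's TPE sampler, and a toy objective in place of the real cross-validated AUPRC), the idea can be sketched as:

```python
import random

random.seed(0)

def sample_params():
    # Same search space as the Optuna trials logged above (an assumption
    # inferred from the printed parameter ranges).
    return {
        "learning_rate": random.uniform(0.01, 0.3),
        "max_depth": random.randint(3, 10),
        "n_estimators": random.randint(100, 500),
        "scale_pos_weight": random.uniform(1.0, 10.0),
    }

def toy_objective(params):
    # Stand-in for the cross-validated AUPRC each trial would return;
    # it simply rewards learning rates close to 0.05.
    return 80.0 - abs(params["learning_rate"] - 0.05) * 100

# Evaluate 50 random configurations and keep the best one.
best = max((sample_params() for _ in range(50)), key=toy_objective)
```

With Optuna, `sample_params` becomes `trial.suggest_float` / `trial.suggest_int` calls inside an objective passed to `study.optimize`; the selection logic is the same.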
✅ Results for CatBoost (Optuna with SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.71
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.85
 - Precision_Test: 28.57
 - Recall_Test: 85.71
 - AUPRC_Test: 77.19
 - Accuracy_Test: 99.62
 - F1-Score_Test: 42.86
 - max_depth: 6
 - n_estimators: 273
 - learning_rate: 0.05
 - scale_pos_weight: 6.08
 - (all remaining CatBoost parameters were reported as None, i.e. left at their library defaults)
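The long parameter dump above comes from CatBoost's `get_params()`, where every unset option is returned as `None`. A small sketch of filtering it down to the explicitly set hyperparameters (using a hand-written stand-in dict, not the real model object):

```python
# Stand-in for model.get_params(): unset CatBoost options come back as None.
raw_params = {
    "max_depth": 6,
    "n_estimators": 273,
    "learning_rate": 0.05,
    "scale_pos_weight": 6.08,
    "iterations": None,
    "eta": None,
}

# Keep only the parameters that were actually set by the search.
set_params = {k: v for k, v in raw_params.items() if v is not None}
```

Printing `set_params` instead of the full dict keeps the per-trial report to the four tuned hyperparameters.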
✅ Updated DataFrame size: (3, 133)

🏆 Average cross-validation AUPRC: 79.2971
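The AUPRC reported throughout this section is the average precision over the precision-recall curve. A minimal sketch of how it can be computed with scikit-learn, on toy labels and scores rather than the project's data:

```python
from sklearn.metrics import average_precision_score

# Toy example: two negatives and two positives with predicted scores.
y_true = [0, 0, 1, 1]
y_scores = [0.1, 0.4, 0.35, 0.8]

# average_precision_score summarizes the precision-recall curve;
# multiply by 100 to match the percentage scale used in these logs.
auprc = average_precision_score(y_true, y_scores) * 100  # ≈ 83.33
```

For a fold-averaged figure like the one above, this value is computed on each validation fold and the results are averaged.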

🔍 Optimizing hyperparameters for CatBoost with Optuna...

🔄 Fold 1: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
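SMOTE balances the classes (from 99.83 % / 0.17 % to 50 % / 50 % above) by interpolating new synthetic minority samples between existing ones. A minimal NumPy sketch of that interpolation idea, on random data — the project itself uses imblearn's `SMOTE`, which additionally restricts pairs to k-nearest neighbours:

```python
import numpy as np

rng = np.random.default_rng(42)
X_min = rng.normal(3.0, 1.0, size=(5, 2))  # tiny minority class

def smote_like(X, n_new, rng):
    # Pick random pairs of minority points and interpolate between them,
    # producing synthetic samples inside the minority region.
    i = rng.integers(0, len(X), size=n_new)
    j = rng.integers(0, len(X), size=n_new)
    lam = rng.random((n_new, 1))
    return X[i] + lam * (X[j] - X[i])

# Generate 95 synthetic points so the minority class reaches 100 samples.
X_synth = smote_like(X_min, 95, rng)
X_min_balanced = np.vstack([X_min, X_synth])
```

Because each synthetic point is a convex combination of two real minority points, the new samples never leave the minority class's range.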

🚀 Training CatBoost (Optuna with SMOTE)...
[CatBoost verbose fit log, iterations 0–289 omitted: train loss fell from 0.5907 to 0.0035, total fit time ≈ 35.8 s]

✅ Results for CatBoost (Optuna with SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.70
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.85
 - Precision_Test: 25.71
 - Recall_Test: 86.51
 - AUPRC_Test: 79.76
 - Accuracy_Test: 99.56
 - F1-Score_Test: 39.64
 - max_depth: 6
 - n_estimators: 290
 - learning_rate: 0.04
 - scale_pos_weight: 5.57
 - (all remaining CatBoost parameters were reported as None, i.e. left at their library defaults)
✅ Results DataFrame updated: shape (1, 133)

🔄 Fold 2: optimization in progress...
📊 Class proportions before SMOTE: 0 → 0.99831789, 1 → 0.00168211
📈 Class proportions after SMOTE: 0 → 0.50000000, 1 → 0.50000000

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.5902859	total: 93.5ms	remaining: 27s
1:	learn: 0.4909089	total: 203ms	remaining: 29.3s
...
288:	learn: 0.0046950	total: 34.2s	remaining: 118ms
289:	learn: 0.0046769	total: 34.4s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting flag: 1 (train metrics are near-perfect while test metrics drop sharply)
 - Precision_Train: 99.61
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.80
 - F1-Score_Train: 99.80
 - Precision_Test: 25.40
 - Recall_Test: 88.10
 - AUPRC_Test: 76.83
 - Accuracy_Test: 99.54
 - F1-Score_Test: 39.43
 - Hyperparameters set by Optuna (every other CatBoost parameter was left at its default and reported as None):
   - max_depth: 6
   - n_estimators: 290
   - learning_rate: 0.04
   - scale_pos_weight: 5.57
✅ Results DataFrame updated: shape (2, 133)
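The false-negative rate reported later in the notebook is simply the complement of recall, so the fold above (Recall_Test = 88.10) misses about 11.9 % of frauds. A toy check with hypothetical labels makes the identity FNR = FN / (FN + TP) = 1 − recall concrete:

```python
# FNR is the complement of recall: FNR = FN / (FN + TP) = 1 - recall.
from sklearn.metrics import confusion_matrix, recall_score

y_true = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1]   # toy labels (hypothetical)
y_pred = [0, 0, 1, 0, 0, 0, 1, 1, 1, 0]   # toy predictions

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
fnr = fn / (fn + tp)
assert abs(fnr - (1 - recall_score(y_true, y_pred))) < 1e-12
print(f"FNR = {fnr:.2f}")   # one fraud missed out of four → 0.25
```

In fraud detection this is the metric with the highest business cost, which is why the comparison tables at the end of the notebook track FNR alongside AUPRC.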

🔄 Fold 3: optimization in progress...
📊 Class proportions before SMOTE: 0 → 0.99831789, 1 → 0.00168211
📈 Class proportions after SMOTE: 0 → 0.50000000, 1 → 0.50000000

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.5921462	total: 175ms	remaining: 50.7s
1:	learn: 0.4935738	total: 352ms	remaining: 50.7s
2:	learn: 0.4124219	total: 539ms	remaining: 51.6s
...
262:	learn: 0.0044124	total: 32.2s	remaining: 3.3s
263:	learn: 0.0043717	total: 32.3s	remaining: 3.18s
264:	learn: 0.0043717	total: 32.5s	remaining: 3.06s
265:	learn: 0.0043510	total: 32.6s	remaining: 2.94s
266:	learn: 0.0043413	total: 32.8s	remaining: 2.83s
267:	learn: 0.0043333	total: 33s	remaining: 2.71s
268:	learn: 0.0043155	total: 33.1s	remaining: 2.59s
269:	learn: 0.0042954	total: 33.3s	remaining: 2.47s
270:	learn: 0.0042954	total: 33.4s	remaining: 2.35s
271:	learn: 0.0042753	total: 33.6s	remaining: 2.22s
272:	learn: 0.0042524	total: 33.8s	remaining: 2.1s
273:	learn: 0.0042385	total: 34s	remaining: 1.98s
274:	learn: 0.0042219	total: 34.1s	remaining: 1.86s
275:	learn: 0.0042136	total: 34.2s	remaining: 1.74s
276:	learn: 0.0041740	total: 34.3s	remaining: 1.61s
277:	learn: 0.0041551	total: 34.5s	remaining: 1.49s
278:	learn: 0.0041396	total: 34.6s	remaining: 1.36s
279:	learn: 0.0041396	total: 34.7s	remaining: 1.24s
280:	learn: 0.0041261	total: 34.8s	remaining: 1.11s
281:	learn: 0.0040901	total: 34.9s	remaining: 989ms
282:	learn: 0.0040767	total: 34.9s	remaining: 864ms
283:	learn: 0.0040652	total: 35s	remaining: 740ms
284:	learn: 0.0040406	total: 35.1s	remaining: 616ms
285:	learn: 0.0040337	total: 35.2s	remaining: 492ms
286:	learn: 0.0040160	total: 35.3s	remaining: 369ms
287:	learn: 0.0040159	total: 35.4s	remaining: 246ms
288:	learn: 0.0040159	total: 35.5s	remaining: 123ms
289:	learn: 0.0040037	total: 35.6s	remaining: 0us
[I 2024-12-19 15:12:44,928] Trial 47 finished with value: 77.65157992289289 and parameters: {'learning_rate': 0.043583875498658255, 'max_depth': 6, 'n_estimators': 290, 'scale_pos_weight': 5.571176876101519}. Best is trial 37 with value: 80.67361029056296.
✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.66
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.83
 - F1-Score_Train: 99.83
 - Precision_Test: 26.73
 - Recall_Test: 85.71
 - AUPRC_Test: 76.36
 - Accuracy_Test: 99.58
 - F1-Score_Test: 40.75
 - max_depth: 6
 - n_estimators: 290
 - learning_rate: 0.04
 - scale_pos_weight: 5.57
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Updated DataFrame size: (3, 133)

🏆 Mean cross-validation AUPRC: 77.6516

🔍 Optimizing hyperparameters for CatBoost with Optuna...

🔄 Fold 1: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4918000	total: 105ms	remaining: 26.9s
[... CatBoost iteration log truncated ...]
257:	learn: 0.0028201	total: 28.1s	remaining: 0us

✅ Results for CatBoost (Optuna with SMOTE):
 - Model: CatBoost
 - Technique: Optuna with SMOTE
 - Overfitting: 1
 - Precision_Train: 99.70
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.85
 - Precision_Test: 26.49
 - Recall_Test: 84.92
 - AUPRC_Test: 79.33
 - Accuracy_Test: 99.58
 - F1-Score_Test: 40.38
 - max_depth: 6
 - n_estimators: 258
 - learning_rate: 0.09
 - scale_pos_weight: 6.46
 - (all other CatBoost parameters: None, i.e. library defaults)
✅ Updated DataFrame size: (1, 133)
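The overfitting flag reported above (1), together with near-perfect train metrics (AUPRC_Train ≈ 99.99) and much lower test metrics (AUPRC_Test ≈ 79.33), suggests a simple train/test-gap check. A plausible sketch of such a check — the 10-point threshold and the function name are assumptions, not the notebook's exact rule:

```python
import numpy as np
from sklearn.metrics import average_precision_score

def overfit_flag(y_train, p_train, y_test, p_test, gap=10.0):
    """Return 1 when train AUPRC exceeds test AUPRC by more than `gap` points."""
    auprc_train = average_precision_score(y_train, p_train) * 100
    auprc_test = average_precision_score(y_test, p_test) * 100
    return int(auprc_train - auprc_test > gap)

# Toy usage: perfectly ranked train scores vs. random test scores
rng = np.random.default_rng(0)
y_tr = np.array([0, 0, 0, 1, 1]); p_tr = np.array([0.1, 0.2, 0.1, 0.9, 0.8])
y_te = rng.integers(0, 2, 200);   p_te = rng.random(200)
print(overfit_flag(y_tr, p_tr, y_te, p_te))  # 1: train AUPRC = 100, test AUPRC ≈ 50
```

With SMOTE the gap is expected: the model sees a perfectly balanced training set but is evaluated on the original, highly imbalanced test distribution.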

🔄 Fold 2: optimization in progress...
📊 Before SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 After SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Training CatBoost (Optuna with SMOTE)...
0:	learn: 0.4999932	total: 88.6ms	remaining: 22.8s
[... CatBoost iteration log truncated ...]
194:	learn: 0.0034258	total: 21.1s	remaining: 6.83s
195:	learn: 0.0034257	total: 21.2s	remaining: 6.7s
196:	learn: 0.0034257	total: 21.3s	remaining: 6.59s
197:	learn: 0.0034257	total: 21.3s	remaining: 6.47s
198:	learn: 0.0034257	total: 21.4s	remaining: 6.35s
199:	learn: 0.0034256	total: 21.5s	remaining: 6.23s
200:	learn: 0.0034256	total: 21.6s	remaining: 6.12s
201:	learn: 0.0034256	total: 21.6s	remaining: 6s
202:	learn: 0.0034255	total: 21.7s	remaining: 5.88s
203:	learn: 0.0034256	total: 21.8s	remaining: 5.76s
204:	learn: 0.0034256	total: 21.9s	remaining: 5.66s
205:	learn: 0.0034255	total: 22s	remaining: 5.55s
206:	learn: 0.0034255	total: 22.1s	remaining: 5.43s
207:	learn: 0.0034255	total: 22.1s	remaining: 5.32s
208:	learn: 0.0034255	total: 22.2s	remaining: 5.21s
209:	learn: 0.0034255	total: 22.3s	remaining: 5.09s
210:	learn: 0.0034254	total: 22.3s	remaining: 4.98s
211:	learn: 0.0034254	total: 22.4s	remaining: 4.87s
212:	learn: 0.0034254	total: 22.5s	remaining: 4.75s
213:	learn: 0.0034254	total: 22.6s	remaining: 4.64s
214:	learn: 0.0034254	total: 22.6s	remaining: 4.53s
215:	learn: 0.0034254	total: 22.7s	remaining: 4.42s
216:	learn: 0.0034254	total: 22.8s	remaining: 4.3s
217:	learn: 0.0034253	total: 22.9s	remaining: 4.2s
218:	learn: 0.0034253	total: 22.9s	remaining: 4.09s
219:	learn: 0.0034253	total: 23s	remaining: 3.97s
220:	learn: 0.0034253	total: 23.1s	remaining: 3.86s
221:	learn: 0.0034252	total: 23.2s	remaining: 3.76s
222:	learn: 0.0034252	total: 23.2s	remaining: 3.65s
223:	learn: 0.0034252	total: 23.3s	remaining: 3.54s
224:	learn: 0.0034253	total: 23.4s	remaining: 3.42s
225:	learn: 0.0034154	total: 23.5s	remaining: 3.32s
226:	learn: 0.0034153	total: 23.5s	remaining: 3.21s
227:	learn: 0.0033952	total: 23.7s	remaining: 3.12s
228:	learn: 0.0033307	total: 23.9s	remaining: 3.02s
229:	learn: 0.0033152	total: 24s	remaining: 2.93s
230:	learn: 0.0032576	total: 24.2s	remaining: 2.83s
231:	learn: 0.0032576	total: 24.4s	remaining: 2.73s
232:	learn: 0.0032576	total: 24.5s	remaining: 2.63s
233:	learn: 0.0032428	total: 24.7s	remaining: 2.53s
234:	learn: 0.0032428	total: 24.8s	remaining: 2.43s
235:	learn: 0.0032428	total: 25s	remaining: 2.33s
236:	learn: 0.0032428	total: 25.1s	remaining: 2.22s
237:	learn: 0.0032428	total: 25.2s	remaining: 2.12s
238:	learn: 0.0032428	total: 25.4s	remaining: 2.02s
239:	learn: 0.0032196	total: 25.5s	remaining: 1.91s
240:	learn: 0.0032035	total: 25.7s	remaining: 1.81s
241:	learn: 0.0031831	total: 25.9s	remaining: 1.71s
242:	learn: 0.0031831	total: 26s	remaining: 1.6s
243:	learn: 0.0031831	total: 26.1s	remaining: 1.5s
244:	learn: 0.0031830	total: 26.3s	remaining: 1.39s
245:	learn: 0.0031830	total: 26.4s	remaining: 1.29s
246:	learn: 0.0031830	total: 26.6s	remaining: 1.18s
247:	learn: 0.0031830	total: 26.7s	remaining: 1.08s
248:	learn: 0.0031678	total: 26.9s	remaining: 971ms
249:	learn: 0.0031678	total: 27s	remaining: 865ms
250:	learn: 0.0031678	total: 27.2s	remaining: 758ms
251:	learn: 0.0031678	total: 27.3s	remaining: 650ms
252:	learn: 0.0031678	total: 27.4s	remaining: 542ms
253:	learn: 0.0031677	total: 27.6s	remaining: 434ms
254:	learn: 0.0031677	total: 27.7s	remaining: 326ms
255:	learn: 0.0031677	total: 27.8s	remaining: 217ms
256:	learn: 0.0031677	total: 28s	remaining: 109ms
257:	learn: 0.0031678	total: 28.1s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.69
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.84
 - F1-Score_Train: 99.84
 - Precision_Test: 28.50
 - Recall_Test: 88.89
 - AUPRC_Test: 78.53
 - Accuracy_Test: 99.61
 - F1-Score_Test: 43.16
 - max_depth: 6
 - n_estimators: 258
 - learning_rate: 0.09
 - scale_pos_weight: 6.46
 [remaining CatBoost parameters: None (library defaults)]
✅ Tamaño del DataFrame actualizado: (2, 133)

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64
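The "Antes de SMOTE" / "Después de SMOTE" printouts above come from `value_counts(normalize=True)` on the `Class` column of each fold. A small sketch of that check, using toy counts (not the real dataset) and random duplication of the minority class as a dependency-free stand-in for SMOTE, which synthesizes new minority points by interpolation rather than copying:

```python
import pandas as pd

# Toy fold with an extreme imbalance similar to the notebook's output.
y = pd.Series([0] * 5936 + [1] * 10, name="Class")

# Exactly how the "Antes de SMOTE" proportions are printed:
print(y.value_counts(normalize=True))

# Stand-in for SMOTE: duplicate minority rows until both classes
# reach the 50/50 split shown in "Después de SMOTE".
minority = y[y == 1]
extra = minority.sample((y == 0).sum() - len(minority),
                        replace=True, random_state=42)
balanced = pd.concat([y, extra])
print(balanced.value_counts(normalize=True))
```

SMOTE itself (from `imbalanced-learn`) would produce the same 0.5/0.5 proportions, but with synthetic feature vectors instead of duplicates.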

🚀 Entrenando CatBoost (Optuna con SMOTE)...
[CatBoost training log truncated: 258 iterations, learn loss 0.5001807 → 0.0030062, total ≈ 27.3s]
[I 2024-12-19 15:14:16,402] Trial 48 finished with value: 78.83491622763522 and parameters: {'learning_rate': 0.08763199259953189, 'max_depth': 6, 'n_estimators': 258, 'scale_pos_weight': 6.459984391212722}. Best is trial 37 with value: 80.67361029056296.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.71
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.85
 - F1-Score_Train: 99.85
 - Precision_Test: 26.85
 - Recall_Test: 86.51
 - AUPRC_Test: 78.64
 - Accuracy_Test: 99.58
 - F1-Score_Test: 40.98
 - max_depth: 6
 - n_estimators: 258
 - learning_rate: 0.09
 - scale_pos_weight: 6.46
 [remaining CatBoost parameters: None (library defaults)]
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 78.8349
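The "Promedio de AUPRC en validación cruzada" above is the mean of the per-fold AUPRC (average precision) scores. A self-contained sketch of that averaging on synthetic imbalanced data (not the credit-card dataset), using a plain logistic regression in place of the tuned CatBoost model:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import StratifiedKFold

# Synthetic 98/2 imbalanced problem, purely illustrative.
X, y = make_classification(n_samples=3000, weights=[0.98, 0.02],
                           random_state=42)

scores = []
skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=42)
for tr, te in skf.split(X, y):
    model = LogisticRegression(max_iter=1000).fit(X[tr], y[tr])
    proba = model.predict_proba(X[te])[:, 1]
    # AUPRC reported as a percentage, as in the notebook's output.
    scores.append(average_precision_score(y[te], proba) * 100)

print(f"Promedio de AUPRC: {np.mean(scores):.4f}")
```

Stratified folds keep the fraud proportion constant across splits, which matters here because a non-stratified split could leave a fold with almost no positive cases.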

🔍 Optimizando hiperparámetros para CatBoost con Optuna...

🔄 Fold 1: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5359939	total: 89.9ms	remaining: 25s
1:	learn: 0.3997972	total: 182ms	remaining: 25.2s
2:	learn: 0.3114907	total: 272ms	remaining: 25.1s
3:	learn: 0.2358218	total: 395ms	remaining: 27.2s
4:	learn: 0.1838363	total: 505ms	remaining: 27.7s
5:	learn: 0.1480308	total: 605ms	remaining: 27.5s
6:	learn: 0.1274817	total: 724ms	remaining: 28.1s
7:	learn: 0.1090974	total: 819ms	remaining: 27.7s
8:	learn: 0.0963457	total: 933ms	remaining: 28s
9:	learn: 0.0867685	total: 1.02s	remaining: 27.5s
10:	learn: 0.0793795	total: 1.14s	remaining: 27.8s
11:	learn: 0.0724165	total: 1.24s	remaining: 27.5s
12:	learn: 0.0683274	total: 1.32s	remaining: 27s
13:	learn: 0.0639359	total: 1.43s	remaining: 27.1s
14:	learn: 0.0590166	total: 1.54s	remaining: 27.1s
15:	learn: 0.0562654	total: 1.63s	remaining: 26.8s
16:	learn: 0.0521613	total: 1.75s	remaining: 27s
17:	learn: 0.0483039	total: 1.83s	remaining: 26.6s
18:	learn: 0.0449725	total: 1.92s	remaining: 26.3s
19:	learn: 0.0424451	total: 2.04s	remaining: 26.4s
20:	learn: 0.0408002	total: 2.13s	remaining: 26.1s
21:	learn: 0.0395162	total: 2.22s	remaining: 25.9s
22:	learn: 0.0382911	total: 2.32s	remaining: 25.9s
23:	learn: 0.0361706	total: 2.42s	remaining: 25.7s
24:	learn: 0.0351918	total: 2.54s	remaining: 25.8s
25:	learn: 0.0340157	total: 2.65s	remaining: 25.7s
26:	learn: 0.0325938	total: 2.74s	remaining: 25.6s
27:	learn: 0.0313319	total: 2.85s	remaining: 25.6s
28:	learn: 0.0306700	total: 2.94s	remaining: 25.3s
29:	learn: 0.0298350	total: 3.02s	remaining: 25.1s
30:	learn: 0.0290763	total: 3.17s	remaining: 25.3s
31:	learn: 0.0283487	total: 3.27s	remaining: 25.2s
32:	learn: 0.0273844	total: 3.36s	remaining: 25s
33:	learn: 0.0266601	total: 3.47s	remaining: 25s
34:	learn: 0.0258974	total: 3.58s	remaining: 25s
35:	learn: 0.0251015	total: 3.67s	remaining: 24.8s
36:	learn: 0.0244867	total: 3.78s	remaining: 24.7s
37:	learn: 0.0237765	total: 3.87s	remaining: 24.5s
38:	learn: 0.0233698	total: 3.95s	remaining: 24.3s
39:	learn: 0.0228438	total: 4.07s	remaining: 24.3s
40:	learn: 0.0220629	total: 4.16s	remaining: 24.1s
41:	learn: 0.0215528	total: 4.25s	remaining: 24s
42:	learn: 0.0208375	total: 4.36s	remaining: 24s
43:	learn: 0.0204089	total: 4.45s	remaining: 23.8s
44:	learn: 0.0197548	total: 4.54s	remaining: 23.6s
45:	learn: 0.0193606	total: 4.67s	remaining: 23.7s
46:	learn: 0.0189584	total: 4.75s	remaining: 23.5s
47:	learn: 0.0184022	total: 4.85s	remaining: 23.3s
48:	learn: 0.0179786	total: 4.96s	remaining: 23.3s
49:	learn: 0.0175836	total: 5.06s	remaining: 23.2s
50:	learn: 0.0172149	total: 5.15s	remaining: 23s
51:	learn: 0.0168691	total: 5.27s	remaining: 23s
52:	learn: 0.0165213	total: 5.35s	remaining: 22.8s
53:	learn: 0.0162189	total: 5.44s	remaining: 22.7s
54:	learn: 0.0159632	total: 5.55s	remaining: 22.6s
55:	learn: 0.0155435	total: 5.65s	remaining: 22.5s
56:	learn: 0.0152489	total: 5.74s	remaining: 22.4s
57:	learn: 0.0150293	total: 5.85s	remaining: 22.3s
58:	learn: 0.0147296	total: 5.93s	remaining: 22.1s
59:	learn: 0.0144112	total: 6.02s	remaining: 22s
60:	learn: 0.0140131	total: 6.14s	remaining: 21.9s
61:	learn: 0.0138430	total: 6.22s	remaining: 21.8s
62:	learn: 0.0136159	total: 6.31s	remaining: 21.6s
63:	learn: 0.0134445	total: 6.42s	remaining: 21.6s
64:	learn: 0.0132004	total: 6.5s	remaining: 21.4s
65:	learn: 0.0129371	total: 6.59s	remaining: 21.3s
66:	learn: 0.0126756	total: 6.72s	remaining: 21.3s
67:	learn: 0.0125036	total: 6.81s	remaining: 21.1s
68:	learn: 0.0122448	total: 6.9s	remaining: 21s
69:	learn: 0.0120256	total: 7.01s	remaining: 20.9s
70:	learn: 0.0117140	total: 7.1s	remaining: 20.8s
71:	learn: 0.0115426	total: 7.19s	remaining: 20.7s
72:	learn: 0.0113551	total: 7.3s	remaining: 20.6s
73:	learn: 0.0111907	total: 7.39s	remaining: 20.5s
74:	learn: 0.0108848	total: 7.47s	remaining: 20.3s
75:	learn: 0.0106670	total: 7.6s	remaining: 20.3s
76:	learn: 0.0105331	total: 7.7s	remaining: 20.2s
77:	learn: 0.0103298	total: 7.79s	remaining: 20.1s
[... per-iteration CatBoost log truncated: iterations 78-277 omitted; training loss decreases from 0.0102 to 0.0029 and plateaus from around iteration 215 ...]
278:	learn: 0.0029315	total: 30.7s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.69
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.84
 - F1-Score_Train: 99.85
 - Precision_Test: 24.94
 - Recall_Test: 85.71
 - AUPRC_Test: 78.89
 - Accuracy_Test: 99.54
 - F1-Score_Test: 38.64
 - max_depth: 6
 - n_estimators: 279
 - learning_rate: 0.06
 - scale_pos_weight: 6.92
 - [... remaining CatBoost parameters omitted: all None (unset, library defaults) ...]
✅ Tamaño del DataFrame actualizado: (1, 133)
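The "Tamaño del DataFrame actualizado" message above suggests that each fold's metrics are appended as one row to a running results table. A minimal sketch of that pattern, with illustrative names (`resultados`, `fila`) and a reduced set of columns — the notebook's actual table has 133 columns including all CatBoost parameters:

```python
# Hypothetical sketch: accumulating per-fold results in a DataFrame.
# Names and column set are illustrative, not the notebook's exact code.
import pandas as pd

resultados = pd.DataFrame()

fila = {
    "Modelo": "CatBoost",
    "Tecnica": "Optuna con SMOTE",
    "Recall_Test": 85.71,
    "AUPRC_Test": 78.89,
}
# pd.concat (not the removed DataFrame.append) adds the new row.
resultados = pd.concat([resultados, pd.DataFrame([fila])], ignore_index=True)

print(f"✅ Tamaño del DataFrame actualizado: {resultados.shape}")
```

Using `pd.concat` with `ignore_index=True` keeps the row index sequential across folds.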

🔄 Fold 2: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5461770	total: 87.1ms	remaining: 24.2s
[... per-iteration CatBoost log truncated: iterations 1-277 omitted; training loss decreases from 0.546 to 0.0032 and plateaus from around iteration 245 ...]
278:	learn: 0.0032428	total: 32.2s	remaining: 0us

✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.63
 - Recall_Train: 100.00
 - AUPRC_Train: 99.99
 - Accuracy_Train: 99.82
 - F1-Score_Train: 99.82
 - Precision_Test: 26.88
 - Recall_Test: 88.10
 - AUPRC_Test: 79.15
 - Accuracy_Test: 99.58
 - F1-Score_Test: 41.19
 - max_depth: 6
 - n_estimators: 279
 - learning_rate: 0.06
 - scale_pos_weight: 6.92
 - [... remaining CatBoost parameters omitted: all None (unset, library defaults) ...]
✅ Tamaño del DataFrame actualizado: (2, 133)
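The results above report `Sobreajuste: 1` alongside near-perfect training metrics (F1 ≈ 99.8) but a much lower test F1 (≈ 41), i.e. a large train/test gap. One plausible way such a flag could be derived is a simple threshold on that gap; this sketch is an assumption, not the notebook's actual rule, and the 10-point threshold is illustrative:

```python
def flag_sobreajuste(f1_train: float, f1_test: float, umbral: float = 10.0) -> int:
    """Return 1 (overfitting) if the training F1 exceeds the test F1 by more
    than `umbral` percentage points, else 0. Threshold is illustrative."""
    return int(f1_train - f1_test > umbral)

# Fold 2 of this run: train F1 99.82 vs test F1 41.19 → gap ≈ 58.6 points.
print(flag_sobreajuste(99.82, 41.19))  # → 1
```

Whatever the exact rule, the large gap here is the expected signature of training on SMOTE-balanced data and evaluating on the original imbalanced test set.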

🔄 Fold 3: Optimización en progreso...
📊 Antes de SMOTE: Class
0   0.99831789
1   0.00168211
Name: proportion, dtype: float64
📈 Después de SMOTE: Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando CatBoost (Optuna con SMOTE)...
0:	learn: 0.5424722	total: 82.3ms	remaining: 22.9s
[... per-iteration CatBoost log truncated: iterations 1-82 omitted; training loss decreases from 0.542 to 0.0117 ...]
83:	learn: 0.0117288	total: 10.6s	remaining: 24.7s
84:	learn: 0.0116561	total: 10.8s	remaining: 24.6s
85:	learn: 0.0115352	total: 11s	remaining: 24.7s
86:	learn: 0.0114334	total: 11.2s	remaining: 24.6s
87:	learn: 0.0112217	total: 11.3s	remaining: 24.6s
88:	learn: 0.0110375	total: 11.5s	remaining: 24.6s
89:	learn: 0.0108991	total: 11.7s	remaining: 24.5s
90:	learn: 0.0107252	total: 11.8s	remaining: 24.4s
91:	learn: 0.0104999	total: 11.9s	remaining: 24.3s
92:	learn: 0.0103573	total: 12s	remaining: 24s
93:	learn: 0.0101149	total: 12.1s	remaining: 23.8s
94:	learn: 0.0100326	total: 12.2s	remaining: 23.7s
95:	learn: 0.0099798	total: 12.3s	remaining: 23.5s
96:	learn: 0.0097745	total: 12.4s	remaining: 23.3s
97:	learn: 0.0096826	total: 12.5s	remaining: 23.1s
98:	learn: 0.0094531	total: 12.6s	remaining: 22.9s
99:	learn: 0.0093667	total: 12.7s	remaining: 22.7s
100:	learn: 0.0092882	total: 12.8s	remaining: 22.6s
101:	learn: 0.0091360	total: 12.9s	remaining: 22.4s
102:	learn: 0.0090566	total: 13s	remaining: 22.2s
103:	learn: 0.0088911	total: 13.1s	remaining: 22.1s
104:	learn: 0.0087863	total: 13.2s	remaining: 21.9s
105:	learn: 0.0086044	total: 13.3s	remaining: 21.7s
106:	learn: 0.0085231	total: 13.4s	remaining: 21.6s
107:	learn: 0.0083927	total: 13.5s	remaining: 21.4s
108:	learn: 0.0082045	total: 13.6s	remaining: 21.2s
109:	learn: 0.0081168	total: 13.7s	remaining: 21.1s
110:	learn: 0.0080184	total: 13.8s	remaining: 20.9s
111:	learn: 0.0079309	total: 13.9s	remaining: 20.7s
112:	learn: 0.0078642	total: 14s	remaining: 20.6s
113:	learn: 0.0077765	total: 14.1s	remaining: 20.4s
114:	learn: 0.0076871	total: 14.2s	remaining: 20.2s
115:	learn: 0.0076339	total: 14.3s	remaining: 20.1s
116:	learn: 0.0075117	total: 14.4s	remaining: 20s
117:	learn: 0.0074255	total: 14.5s	remaining: 19.8s
118:	learn: 0.0072617	total: 14.6s	remaining: 19.7s
119:	learn: 0.0072372	total: 14.7s	remaining: 19.5s
120:	learn: 0.0071397	total: 14.8s	remaining: 19.3s
121:	learn: 0.0070332	total: 14.9s	remaining: 19.2s
122:	learn: 0.0069415	total: 15s	remaining: 19.1s
123:	learn: 0.0068380	total: 15.1s	remaining: 18.9s
124:	learn: 0.0067518	total: 15.3s	remaining: 18.8s
125:	learn: 0.0066545	total: 15.4s	remaining: 18.7s
126:	learn: 0.0065744	total: 15.5s	remaining: 18.5s
127:	learn: 0.0064971	total: 15.6s	remaining: 18.4s
128:	learn: 0.0064240	total: 15.7s	remaining: 18.2s
129:	learn: 0.0063651	total: 15.7s	remaining: 18s
130:	learn: 0.0063161	total: 15.9s	remaining: 17.9s
131:	learn: 0.0062273	total: 16s	remaining: 17.8s
132:	learn: 0.0062089	total: 16.1s	remaining: 17.6s
133:	learn: 0.0061434	total: 16.2s	remaining: 17.5s
134:	learn: 0.0060684	total: 16.3s	remaining: 17.3s
135:	learn: 0.0059840	total: 16.4s	remaining: 17.2s
136:	learn: 0.0058922	total: 16.5s	remaining: 17.1s
137:	learn: 0.0057858	total: 16.6s	remaining: 16.9s
138:	learn: 0.0057042	total: 16.7s	remaining: 16.8s
139:	learn: 0.0056104	total: 16.8s	remaining: 16.7s
140:	learn: 0.0055383	total: 16.9s	remaining: 16.5s
141:	learn: 0.0054904	total: 17s	remaining: 16.4s
142:	learn: 0.0054192	total: 17.1s	remaining: 16.3s
143:	learn: 0.0053579	total: 17.2s	remaining: 16.1s
144:	learn: 0.0053319	total: 17.3s	remaining: 16s
145:	learn: 0.0053075	total: 17.4s	remaining: 15.8s
146:	learn: 0.0052174	total: 17.5s	remaining: 15.7s
147:	learn: 0.0051554	total: 17.6s	remaining: 15.5s
148:	learn: 0.0050972	total: 17.7s	remaining: 15.4s
149:	learn: 0.0050074	total: 17.8s	remaining: 15.3s
150:	learn: 0.0049455	total: 17.9s	remaining: 15.2s
151:	learn: 0.0048778	total: 18s	remaining: 15s
152:	learn: 0.0048417	total: 18.1s	remaining: 14.9s
153:	learn: 0.0048158	total: 18.2s	remaining: 14.8s
154:	learn: 0.0047761	total: 18.3s	remaining: 14.6s
155:	learn: 0.0047466	total: 18.4s	remaining: 14.5s
156:	learn: 0.0047060	total: 18.5s	remaining: 14.3s
157:	learn: 0.0046888	total: 18.6s	remaining: 14.2s
158:	learn: 0.0046284	total: 18.7s	remaining: 14.1s
159:	learn: 0.0045905	total: 18.7s	remaining: 13.9s
160:	learn: 0.0045597	total: 18.9s	remaining: 13.8s
161:	learn: 0.0045066	total: 19s	remaining: 13.7s
162:	learn: 0.0044674	total: 19s	remaining: 13.6s
163:	learn: 0.0044490	total: 19.2s	remaining: 13.4s
164:	learn: 0.0043931	total: 19.3s	remaining: 13.3s
165:	learn: 0.0043205	total: 19.4s	remaining: 13.2s
166:	learn: 0.0042604	total: 19.5s	remaining: 13.1s
167:	learn: 0.0042315	total: 19.6s	remaining: 12.9s
168:	learn: 0.0041946	total: 19.7s	remaining: 12.8s
169:	learn: 0.0041843	total: 19.8s	remaining: 12.7s
170:	learn: 0.0041349	total: 19.9s	remaining: 12.5s
171:	learn: 0.0041112	total: 19.9s	remaining: 12.4s
172:	learn: 0.0040856	total: 20.1s	remaining: 12.3s
173:	learn: 0.0040250	total: 20.1s	remaining: 12.2s
174:	learn: 0.0040072	total: 20.2s	remaining: 12s
175:	learn: 0.0039891	total: 20.4s	remaining: 11.9s
176:	learn: 0.0039637	total: 20.5s	remaining: 11.8s
177:	learn: 0.0039228	total: 20.6s	remaining: 11.7s
178:	learn: 0.0038895	total: 20.7s	remaining: 11.6s
179:	learn: 0.0038490	total: 20.8s	remaining: 11.4s
180:	learn: 0.0038327	total: 20.9s	remaining: 11.3s
181:	learn: 0.0037824	total: 21s	remaining: 11.2s
182:	learn: 0.0037608	total: 21.1s	remaining: 11.1s
183:	learn: 0.0037344	total: 21.2s	remaining: 10.9s
184:	learn: 0.0037344	total: 21.3s	remaining: 10.8s
185:	learn: 0.0036919	total: 21.4s	remaining: 10.7s
186:	learn: 0.0036550	total: 21.5s	remaining: 10.6s
187:	learn: 0.0036085	total: 21.6s	remaining: 10.4s
188:	learn: 0.0035686	total: 21.7s	remaining: 10.3s
189:	learn: 0.0035492	total: 21.8s	remaining: 10.2s
190:	learn: 0.0035179	total: 21.9s	remaining: 10.1s
191:	learn: 0.0034951	total: 22.1s	remaining: 10s
192:	learn: 0.0034665	total: 22.3s	remaining: 9.93s
193:	learn: 0.0034461	total: 22.4s	remaining: 9.83s
194:	learn: 0.0033998	total: 22.6s	remaining: 9.74s
195:	learn: 0.0033826	total: 22.8s	remaining: 9.64s
196:	learn: 0.0033826	total: 22.9s	remaining: 9.53s
197:	learn: 0.0033677	total: 23.1s	remaining: 9.44s
198:	learn: 0.0033677	total: 23.2s	remaining: 9.33s
199:	learn: 0.0033568	total: 23.4s	remaining: 9.24s
200:	learn: 0.0033182	total: 23.6s	remaining: 9.15s
201:	learn: 0.0032926	total: 23.7s	remaining: 9.05s
202:	learn: 0.0032559	total: 23.9s	remaining: 8.96s
203:	learn: 0.0032318	total: 24.1s	remaining: 8.86s
204:	learn: 0.0032318	total: 24.2s	remaining: 8.74s
205:	learn: 0.0032038	total: 24.4s	remaining: 8.65s
206:	learn: 0.0032038	total: 24.6s	remaining: 8.54s
207:	learn: 0.0031789	total: 24.7s	remaining: 8.44s
208:	learn: 0.0031789	total: 24.8s	remaining: 8.32s
209:	learn: 0.0031789	total: 25s	remaining: 8.21s
210:	learn: 0.0031789	total: 25.1s	remaining: 8.1s
211:	learn: 0.0031606	total: 25.3s	remaining: 8s
212:	learn: 0.0031456	total: 25.5s	remaining: 7.89s
213:	learn: 0.0031456	total: 25.6s	remaining: 7.78s
214:	learn: 0.0031456	total: 25.7s	remaining: 7.66s
215:	learn: 0.0031311	total: 25.9s	remaining: 7.56s
216:	learn: 0.0031311	total: 26.1s	remaining: 7.45s
217:	learn: 0.0031311	total: 26.2s	remaining: 7.34s
218:	learn: 0.0031311	total: 26.4s	remaining: 7.23s
219:	learn: 0.0031311	total: 26.5s	remaining: 7.12s
220:	learn: 0.0031216	total: 26.7s	remaining: 7.01s
221:	learn: 0.0031132	total: 26.9s	remaining: 6.91s
222:	learn: 0.0031132	total: 27s	remaining: 6.79s
223:	learn: 0.0031132	total: 27.2s	remaining: 6.68s
224:	learn: 0.0031132	total: 27.3s	remaining: 6.56s
225:	learn: 0.0030874	total: 27.5s	remaining: 6.45s
226:	learn: 0.0030874	total: 27.6s	remaining: 6.33s
227:	learn: 0.0030874	total: 27.8s	remaining: 6.21s
228:	learn: 0.0030874	total: 27.9s	remaining: 6.1s
229:	learn: 0.0030874	total: 28.1s	remaining: 5.98s
230:	learn: 0.0030874	total: 28.2s	remaining: 5.86s
231:	learn: 0.0030874	total: 28.4s	remaining: 5.75s
232:	learn: 0.0030874	total: 28.5s	remaining: 5.63s
233:	learn: 0.0030874	total: 28.7s	remaining: 5.51s
234:	learn: 0.0030874	total: 28.8s	remaining: 5.39s
235:	learn: 0.0030874	total: 28.9s	remaining: 5.27s
236:	learn: 0.0030874	total: 29.1s	remaining: 5.15s
237:	learn: 0.0030874	total: 29.2s	remaining: 5.04s
238:	learn: 0.0030873	total: 29.4s	remaining: 4.92s
239:	learn: 0.0030873	total: 29.6s	remaining: 4.8s
240:	learn: 0.0030873	total: 29.7s	remaining: 4.68s
241:	learn: 0.0030873	total: 29.9s	remaining: 4.57s
242:	learn: 0.0030873	total: 30s	remaining: 4.45s
243:	learn: 0.0030873	total: 30.2s	remaining: 4.33s
244:	learn: 0.0030873	total: 30.3s	remaining: 4.21s
245:	learn: 0.0030873	total: 30.4s	remaining: 4.08s
246:	learn: 0.0030873	total: 30.6s	remaining: 3.96s
247:	learn: 0.0030873	total: 30.7s	remaining: 3.84s
248:	learn: 0.0030873	total: 30.9s	remaining: 3.72s
249:	learn: 0.0030873	total: 31s	remaining: 3.6s
250:	learn: 0.0030872	total: 31.2s	remaining: 3.48s
251:	learn: 0.0030873	total: 31.3s	remaining: 3.35s
252:	learn: 0.0030873	total: 31.4s	remaining: 3.23s
253:	learn: 0.0030873	total: 31.6s	remaining: 3.11s
254:	learn: 0.0030873	total: 31.7s	remaining: 2.98s
255:	learn: 0.0030872	total: 31.9s	remaining: 2.86s
256:	learn: 0.0030872	total: 32s	remaining: 2.74s
257:	learn: 0.0030872	total: 32.2s	remaining: 2.62s
258:	learn: 0.0030872	total: 32.3s	remaining: 2.5s
259:	learn: 0.0030872	total: 32.5s	remaining: 2.37s
260:	learn: 0.0030872	total: 32.6s	remaining: 2.25s
261:	learn: 0.0030872	total: 32.7s	remaining: 2.12s
262:	learn: 0.0030872	total: 32.9s	remaining: 2s
263:	learn: 0.0030872	total: 33s	remaining: 1.88s
264:	learn: 0.0030872	total: 33.2s	remaining: 1.75s
265:	learn: 0.0030872	total: 33.3s	remaining: 1.63s
266:	learn: 0.0030872	total: 33.4s	remaining: 1.5s
267:	learn: 0.0030872	total: 33.6s	remaining: 1.38s
268:	learn: 0.0030872	total: 33.7s	remaining: 1.25s
269:	learn: 0.0030872	total: 33.8s	remaining: 1.13s
270:	learn: 0.0030872	total: 33.9s	remaining: 1s
271:	learn: 0.0030871	total: 34s	remaining: 874ms
272:	learn: 0.0030871	total: 34s	remaining: 748ms
273:	learn: 0.0030871	total: 34.1s	remaining: 623ms
274:	learn: 0.0030871	total: 34.2s	remaining: 498ms
275:	learn: 0.0030871	total: 34.3s	remaining: 373ms
276:	learn: 0.0030871	total: 34.4s	remaining: 248ms
277:	learn: 0.0030871	total: 34.5s	remaining: 124ms
278:	learn: 0.0030871	total: 34.5s	remaining: 0us
[I 2024-12-19 15:16:01,997] Trial 49 finished with value: 77.89523866432027 and parameters: {'learning_rate': 0.06490773016340588, 'max_depth': 6, 'n_estimators': 279, 'scale_pos_weight': 6.917567438408559}. Best is trial 37 with value: 80.67361029056296.
✅ Resultados para CatBoost (Optuna con SMOTE):
 - Modelo: CatBoost
 - Tecnica: Optuna con SMOTE
 - Sobreajuste: 1
 - Precision_Train: 99.66
 - Recall_Train: 100.00
 - AUPRC_Train: 99.98
 - Accuracy_Train: 99.83
 - F1-Score_Train: 99.83
 - Precision_Test: 25.96
 - Recall_Test: 85.71
 - AUPRC_Test: 75.65
 - Accuracy_Test: 99.56
 - F1-Score_Test: 39.85
 - learning_rate: 0.06
 - max_depth: 6
 - n_estimators: 279
 - scale_pos_weight: 6.92
 - [remaining CatBoost constructor parameters (depth, iterations, l2_leaf_reg, verbose, ...): all None]
✅ Tamaño del DataFrame actualizado: (3, 133)

🏆 Promedio de AUPRC en validación cruzada: 77.8952

🔍 Consolidando y ordenando resultados...

🏆 Resultados Finales Ordenados:
Empty DataFrame
Columns: [Modelo, Tecnica, Fold, Sobreajuste, Precision_Train, Recall_Train, AUPRC_Train, Accuracy_Train, F1-Score_Train, Precision_Test, Recall_Test, AUPRC_Test, Accuracy_Test, F1-Score_Test, learning_rate, max_depth, n_estimators, scale_pos_weight, iterations, class_weights]
Index: []

✅ Resultados guardados en 'resultados_optuna_con_smote.csv'
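Throughout these tables, AUPRC is the area under the precision-recall curve, the headline metric used to rank the techniques; scikit-learn's `average_precision_score` is the standard way to compute it. A minimal sketch with toy labels and scores (illustrative values, not data from this notebook):

```python
import numpy as np
from sklearn.metrics import average_precision_score

# Toy imbalanced labels (1 = fraud) and model scores (illustrative only)
y_true = np.array([0, 0, 0, 0, 1, 1])
y_scores = np.array([0.10, 0.20, 0.15, 0.90, 0.80, 0.95])

# Average precision summarizes the precision-recall curve as a
# recall-weighted mean of precision, reported here on a 0-100 scale
auprc = 100 * average_precision_score(y_true, y_scores)
print(f"AUPRC: {auprc:.2f}")  # 83.33 for this toy example
```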

Consolidation of model metrics and hyperparameters¶

To compare the results obtained across all the techniques effectively, we consolidate the metrics and hyperparameters into a single DataFrame called resultados_maestro.

This makes it possible to analyze the performance of the models and of the strategies applied (SMOTE, ADASYN, cross-validation with and without balancing, GridSearchCV, and Optuna).

In [ ]:
%%time

# DataFrames produced by the different techniques
dataframes_resultados = [
    resultados_validacion_cruzada_con_smote,  # Cross-validation with SMOTE
    resultados_validacion_cruzada_sin_smote,  # Cross-validation WITHOUT SMOTE
    resultados_adasyn,                       # Cross-validation with ADASYN
    resultados_gridsearch_con_smote,         # GridSearchCV with SMOTE
    resultados_optuna                        # Optimization with Optuna
]

# Concatenate all results into a single master DataFrame
resultados_maestro = pd.concat(dataframes_resultados, ignore_index=True)

# Show the total number of consolidated records
print(f"✅ Consolidación completada: {len(resultados_maestro)} registros totales.")
print("\n🏆 Resultados Consolidados:")
print(resultados_maestro.head())

print(f"\n🚨 Verificación final:")
print(f"Validación Cruzada con SMOTE: {resultados_validacion_cruzada_con_smote.shape}")
print(f"Validación Cruzada SIN SMOTE: {resultados_validacion_cruzada_sin_smote.shape}")
print(f"Validación Cruzada con ADASYN: {resultados_adasyn.shape}")
print(f"GridSearchCV con SMOTE: {resultados_gridsearch_con_smote.shape}")
print(f"Optuna: {resultados_optuna.shape}")

# Save the consolidated results to a CSV file
output_file = "resultados_maestro_consolidados.csv"
resultados_maestro.to_csv(output_file, index=False)
print(f"\n✅ Resultados consolidados guardados en '{output_file}'")
✅ Consolidación completada: 62 registros totales.

🏆 Resultados Consolidados:
     Modelo                       Tecnica Sobreajuste  Precision_Train  \
0  CatBoost  Validación Cruzada con SMOTE           1      93.86611482   
1  CatBoost  Validación Cruzada con SMOTE           1      94.32083653   
2  CatBoost  Validación Cruzada con SMOTE           1      94.49939844   
3  CatBoost  Validación Cruzada con SMOTE           1      93.68708317   
4  CatBoost  Validación Cruzada con SMOTE           1      93.67646718   

   Recall_Train  AUPRC_Train  Accuracy_Train  F1-Score_Train  Precision_Test  \
0   99.96978792  99.79652302     96.71852248     96.82185249      2.47933884   
1   99.98365576  99.82593432     96.98176379     97.06972741      2.42047026   
2   99.97870296  99.82816188     97.07958159     97.16186263      2.42468773   
3   99.97771240  99.80735199     96.62045704     96.73023152      2.35668790   
4   99.96731152  99.80694122     96.60956088     96.71970501      2.30473752   

   Recall_Test  AUPRC_Test  Accuracy_Test  F1-Score_Test iterations  \
0  94.73684211 67.68355898    93.68992524     4.83221477        300   
1  92.10526316 60.34089979    93.70772517     4.71698113        300   
2  86.84210526 63.90756485    94.06817373     4.71765547        300   
3  97.36842105 71.12010920    93.17372731     4.60199005        300   
4  94.73684211 62.58593364    93.20042720     4.50000000        300   

   learning_rate depth class_weights verbose max_depth n_estimators  \
0     0.03000000     3       [1, 10]      50      None         None   
1     0.03000000     3       [1, 10]      50      None         None   
2     0.03000000     3       [1, 10]      50      None         None   
3     0.03000000     3       [1, 10]      50      None         None   
4     0.03000000     3       [1, 10]      50      None         None   

  scale_pos_weight min_child_weight gamma  l2_leaf_reg subsample Fold  \
0             None              NaN   NaN  15.00000000      None   10   
1             None              NaN   NaN  15.00000000      None   10   
2             None              NaN   NaN  15.00000000      None   10   
3             None              NaN   NaN  15.00000000      None   10   
4             None              NaN   NaN  15.00000000      None   10   

[output truncated: the remaining ~110 columns (model_shrink_mode, per_feature_ctr, eta, ..., od_wait, kwargs) are unset CatBoost constructor parameters, all None/NaN for these rows]

🚨 Verificación final:
Validación Cruzada con SMOTE: (20, 136)
Validación Cruzada SIN SMOTE: (20, 136)
Validación Cruzada con ADASYN: (20, 136)
GridSearchCV con SMOTE: (2, 134)
Optuna: (0, 20)

✅ Resultados consolidados guardados en 'resultados_maestro_consolidados.csv'
CPU times: user 90.4 ms, sys: 1.99 ms, total: 92.4 ms
Wall time: 94.8 ms

Select the Winning Model¶

Train and Save the Winning Model as a .pkl file:

Identifies the winning model, extracts its hyperparameters dynamically, and then instantiates, trains, and saves the .pkl automatically.

In [ ]:
resultados_Sobreajuste = resultados_maestro[resultados_maestro['Sobreajuste'] == 1]
resultados_Sobreajuste.shape
Out[ ]:
(52, 136)
In [ ]:
resultados_NoSobreajuste = resultados_maestro[resultados_maestro['Sobreajuste'] == 0]
resultados_NoSobreajuste.shape
Out[ ]:
(10, 136)
In [ ]:
# ============================
# Filtrar y Ordenar el Modelo Ganador
# ============================
from sklearn.model_selection import StratifiedKFold
from imblearn.over_sampling import SMOTE, ADASYN
from catboost import CatBoostClassifier
from xgboost import XGBClassifier
import numpy as np
import joblib
from inspect import signature

# Diccionario de clases de modelos
model_classes = {
    "CatBoost": CatBoostClassifier,
    "XGBoost": XGBClassifier
}

# Paso 1: Filtrar modelos sin sobreajuste
resultados_filtrados = resultados_maestro[resultados_maestro['Sobreajuste'] == 0]

# Verificar si hay modelos válidos después del filtro
if resultados_filtrados.empty:
    print("\n❌ No se encontraron modelos válidos después de aplicar los filtros de sobreajuste.")
else:
    # Paso 2: Filtrar modelos con Recall y AUPRC estrictamente menores a 99.99%
    resultados_filtrados = resultados_filtrados[
        (resultados_filtrados['Recall_Test'] < 99.99) & (resultados_filtrados['AUPRC_Test'] < 99.99)
    ]

    if resultados_filtrados.empty:
        print("\n❌ No se encontraron modelos después de aplicar filtros de métricas (Recall/AUPRC).")
    else:
        # Paso 3: Ordenar por Recall, AUPRC, Precision y F1-Score en orden descendente
        resultados_ordenados = resultados_filtrados.sort_values(
            by=['Recall_Test', 'AUPRC_Test', 'Precision_Test', 'F1-Score_Test'],
            ascending=[False, False, False, False]
        )

        # Seleccionar el modelo top 1
        mejor_modelo = resultados_ordenados.iloc[0]
        modelo_tipo = mejor_modelo['Modelo']
        tecnica_ganadora = mejor_modelo['Tecnica']

        # ============================
        # Extraer Hiperparámetros del Modelo Ganador
        # ============================
        hiperparametros_mejor_modelo = mejor_modelo.dropna()
        columnas_excluir = [
            'Modelo', 'Tecnica', 'Sobreajuste', 'Recall_Train', 'AUPRC_Train',
            'Precision_Train', 'F1-Score_Train', 'Accuracy_Train',
            'Recall_Test', 'AUPRC_Test', 'Precision_Test', 'F1-Score_Test', 'Accuracy_Test'
        ]
        hiperparametros_final = {
            col: val for col, val in hiperparametros_mejor_modelo.items()
            if col not in columnas_excluir and not isinstance(val, (list, np.ndarray))
        }
        parametros_validos = set(signature(model_classes[modelo_tipo]).parameters.keys())
        hiperparametros_final = {k: v for k, v in hiperparametros_final.items() if k in parametros_validos}

        # ============================
        # Imprimir Resumen del Modelo Ganador
        # ============================
        print("\n\033[1m🏆 **Modelo Ganador Seleccionado:**\033[0m")
        print(f"🔹 \033[1mModelo:\033[0m {modelo_tipo}")
        print(f"🔹 \033[1mTécnica Ganadora:\033[0m {tecnica_ganadora}")
        print(f"🔹 \033[1mRecall (Test):\033[0m {mejor_modelo['Recall_Test']:.2f}%")
        print(f"🔹 \033[1mAUPRC (Test):\033[0m {mejor_modelo['AUPRC_Test']:.2f}%")
        print(f"🔹 \033[1mPrecision (Test):\033[0m {mejor_modelo['Precision_Test']:.2f}%")
        print(f"🔹 \033[1mF1-Score (Test):\033[0m {mejor_modelo['F1-Score_Test']:.2f}%")
        print("\n\033[1m🔧 **Hiperparámetros del Modelo Ganador:**\033[0m")
        for param, value in hiperparametros_final.items():
            print(f"🔹 \033[1m{param}:\033[0m {value}")

        # ============================
        # Entrenamiento del Modelo Ganador
        # ============================
        print("\n🔧 Instanciando el modelo ganador...")
        modelo_ganador = model_classes[modelo_tipo](**hiperparametros_final)

        # Check "SIN SMOTE" first: the generic "SMOTE" substring test below
        # would otherwise also match techniques named "... SIN SMOTE".
        if "SIN SMOTE" in tecnica_ganadora:
            print("\n🚀 Entrenando el modelo final SIN SMOTE...")
            modelo_ganador.fit(X_train, y_train)

        elif "SMOTE" in tecnica_ganadora or "ADASYN" in tecnica_ganadora:
            print(f"\n🚀 Aplicando {tecnica_ganadora} al conjunto completo...")
            smote_adasyn = SMOTE(random_state=42) if "SMOTE" in tecnica_ganadora else ADASYN(random_state=42)
            X_res, y_res = smote_adasyn.fit_resample(X_train, y_train)
            print(f"📊 Distribución de clases después de {tecnica_ganadora}:")
            print(y_res.value_counts(normalize=True))
            print("\n🚀 Entrenando el modelo final...")
            modelo_ganador.fit(X_res, y_res)

        elif "GridSearchCV" in tecnica_ganadora or "Optuna" in tecnica_ganadora:
            print("\n🚀 Entrenando el modelo final con hiperparámetros optimizados...")
            modelo_ganador.fit(X_train, y_train)

        else:
            print("\n⚠️ Técnica desconocida. Entrenando con el conjunto completo...")
            modelo_ganador.fit(X_train, y_train)

        # ============================
        # Guardar el Modelo Ganador
        # ============================
        nombre_modelo_pkl = f"{modelo_tipo.lower()}_final.pkl"
        joblib.dump(modelo_ganador, nombre_modelo_pkl)
        print(f"\n✅ Modelo guardado como: {nombre_modelo_pkl}")
🏆 **Modelo Ganador Seleccionado:**
🔹 Modelo: XGBoost
🔹 Técnica Ganadora: Validación Cruzada SIN SMOTE
🔹 Recall (Test): 89.19%
🔹 AUPRC (Test): 95.20%
🔹 Precision (Test): 94.29%
🔹 F1-Score (Test): 91.67%

🔧 **Hiperparámetros del Modelo Ganador:**

🔧 Instanciando el modelo ganador...

🚀 Aplicando Validación Cruzada SIN SMOTE al conjunto completo...
📊 Distribución de clases después de Validación Cruzada SIN SMOTE:
Class
0   0.50000000
1   0.50000000
Name: proportion, dtype: float64

🚀 Entrenando el modelo final...

✅ Modelo guardado como: xgboost_final.pkl

Load the Most Recent Model¶

Automatically finds and loads the most recent "_final.pkl" file in the project root.

In [ ]:
# ==========================
# Cargar el Modelo Más Reciente
# ==========================

import os
import joblib

# Buscar archivos .pkl y seleccionar el más reciente
pkl_files = [f for f in os.listdir() if f.endswith("_final.pkl")]

# Verificar si existen archivos pkl
if not pkl_files:
    raise FileNotFoundError("No se encontraron archivos .pkl en el directorio actual.")

# Ordenar archivos por fecha de modificación
pkl_files.sort(key=os.path.getmtime, reverse=True)
modelo_pkl = pkl_files[0]  # Seleccionar el archivo más reciente

# Cargar el modelo
print(f"🔍 Modelo más reciente cargado: {modelo_pkl}")
modelo_ganador = joblib.load(modelo_pkl)
🔍 Modelo más reciente cargado: xgboost_final.pkl

Make Predictions¶

Makes predictions on the test set (X_test).

In [ ]:
# ==========================
# Realizar Predicciones
# ==========================
print("\n🚀 Realizando predicciones en el conjunto de prueba...")
y_pred = modelo_ganador.predict(X_test)
y_pred_proba = modelo_ganador.predict_proba(X_test)[:, 1]  # Probabilidades clase positiva

# Comparación de las primeras 20 predicciones
print("\n📊 Comparación de Predicciones:")
comparacion_df = pd.DataFrame({
    "Categorías Reales": y_test.values[:20],
    "Categorías Predichas": y_pred[:20]
})
print(comparacion_df)
🚀 Realizando predicciones en el conjunto de prueba...

📊 Comparación de Predicciones:
    Categorías Reales  Categorías Predichas
0                   0                     0
1                   0                     0
2                   0                     0
3                   0                     0
4                   0                     0
5                   0                     0
6                   0                     0
7                   0                     0
8                   0                     0
9                   0                     0
10                  0                     0
11                  0                     0
12                  0                     0
13                  0                     0
14                  0                     0
15                  0                     0
16                  0                     0
17                  0                     0
18                  0                     0
19                  0                     0

Classification Report and Metrics¶

Presents the classification report and metrics for the prediction results.

Visualizations

Generates the confusion matrix, the precision-recall curve, and the ROC curve for the test set (X_test).

In [ ]:
from sklearn.metrics import ConfusionMatrixDisplay, confusion_matrix
import matplotlib.pyplot as plt

# ============================
# Generar y Visualizar Matriz de Confusión
# ============================
print("\n📊 Visualizando la Matriz de Confusión...")

# Generar predicciones del modelo ganador
y_pred = modelo_ganador.predict(X_test)

# Calcular matriz de confusión
cm = confusion_matrix(y_test, y_pred)

# Visualizar la matriz de confusión
fig, ax = plt.subplots(figsize=(8, 6))
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=["Clase 0", "Clase 1"])
disp.plot(cmap='Blues', ax=ax)
ax.set_title(f"Matriz de Confusión - {modelo_tipo}")
plt.show()
(figure: confusion matrix for the winning model on the test set)
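The precision-recall and ROC curves announced above are not produced by the cell itself. A minimal sketch could look like the following; the labels and scores are simulated here so the example is self-contained, whereas in the notebook `y_test`, `y_pred`, and `y_pred_proba` come from the winning model:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless-safe backend so the sketch also runs without a display
import matplotlib.pyplot as plt
from sklearn.metrics import (classification_report, precision_recall_curve,
                             roc_curve, auc, average_precision_score)

# Simulated labels and scores (stand-ins for y_test / y_pred_proba)
rng = np.random.default_rng(42)
y_test = rng.integers(0, 2, 500)
y_pred_proba = np.clip(y_test * 0.6 + rng.random(500) * 0.5, 0, 1)
y_pred = (y_pred_proba >= 0.5).astype(int)

# Classification report for both classes
print(classification_report(y_test, y_pred, target_names=["No Fraude", "Fraude"]))

# Precision-Recall curve and average precision
prec, rec, _ = precision_recall_curve(y_test, y_pred_proba)
ap = average_precision_score(y_test, y_pred_proba)

# ROC curve and its AUC
fpr, tpr, _ = roc_curve(y_test, y_pred_proba)
roc_auc = auc(fpr, tpr)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 5))
ax1.plot(rec, prec, label=f"AP = {ap:.3f}")
ax1.set(xlabel="Recall", ylabel="Precision", title="Precision-Recall Curve")
ax1.legend()
ax2.plot(fpr, tpr, label=f"AUC = {roc_auc:.3f}")
ax2.plot([0, 1], [0, 1], "--", color="gray")  # chance diagonal
ax2.set(xlabel="FPR", ylabel="TPR", title="ROC Curve")
ax2.legend()
plt.show()
```

Because the project's AUPRC metric is the area under the first curve, plotting both side by side makes it easy to see why PR is more informative than ROC on a heavily imbalanced dataset.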

Application for testing the model on new data¶

An application for testing the model with new, randomly generated data.

  • It automatically loads the most recent .pkl file found in the folder, so the model with the latest modification date is selected.
Generate simulated data in a CSV file¶

Generates random data consistent with the features of your original dataset, such as the scaled Amount variable and the other columns the model uses.

In [ ]:
# =======================================
# Generar datos simulados y exportar a CSV
# =======================================

from google.colab import files

# Configurar parámetros
n_muestras = 50  # Número de muestras aleatorias a generar
np.random.seed(42)  # Fijar la semilla para reproducibilidad

# Crear la estructura con las mismas columnas y orden del dataframe clean_data
columnas_originales = ['V1', 'V2', 'V3', 'V4', 'V5', 'V6', 'V7', 'V8', 'V9',
                       'V10', 'V11', 'V12', 'V13', 'V14', 'V15', 'V16', 'V17',
                       'V18', 'V19', 'V20', 'V21', 'V22', 'V23', 'V24', 'V25',
                       'V26', 'V27', 'V28', 'Amount']  # Ajusta según clean_data

# Generar valores aleatorios para cada columna
data_simulada = pd.DataFrame({
    col: np.random.uniform(-5, 5, n_muestras) if col != 'Amount' else np.random.uniform(0, 2000, n_muestras)
    for col in columnas_originales
})

# =============================
# Guardar los datos simulados
# =============================
output_csv = "datos_nuevos_simulados.csv"
data_simulada.to_csv(output_csv, index=False)
print(f"✅ Archivo generado y guardado como '{output_csv}'")

# =============================
# Visualizar los datos
# =============================
print("\n📊 Primeras filas de los datos generados:")
display(data_simulada.head(5))

# =============================
# Descargar el archivo CSV
# =============================
files.download(output_csv)  # Descargar automáticamente el archivo generado en tu ordenador
V1 V2 V3 V4 V5 V6 V7 V8 V9 V10 V11 V12 V13 V14 V15 V16 V17 V18 V19 V20 V21 V22 V23 V24 V25 V26 V27 V28 Amount
0 -1.25459881 4.69584628 -4.68570814 4.08265886 1.42031646 -2.05551108 -4.48318279 0.03136259 -3.96876131 2.77146916 1.98161714 0.20163501 -3.31064937 -4.13079712 0.32589433 -3.83101959 2.07238634 -4.32923523 -2.92113745 1.00516860 -3.14867071 3.74701673 -4.42157323 2.08362977 2.58263196 2.96537291 -4.68413855 3.67031496 334.08381580
1 4.50714306 2.75132823 1.36410411 -2.60438109 -4.15860035 -1.14902271 0.31354632 3.56489841 4.02552907 0.58404250 0.36096366 3.52181500 -2.21409661 0.37106542 -4.48176463 4.39832124 -3.47460957 2.49960470 -4.73467796 1.65036675 0.41900947 0.28937134 4.69102630 3.37013328 -4.75413084 -0.18477648 4.36212246 0.78675409 335.23843258
2 2.31993942 4.39498942 -1.85644019 -3.55105128 -3.38371286 3.51136672 0.40635122 1.58693632 0.05252372 -0.75777991 -1.90472384 0.51906839 -3.22989516 0.86841118 -1.63395722 1.27708053 0.76288360 -2.90094407 -3.18564565 -3.24628721 3.72945836 4.39067699 3.83785885 1.97471462 -4.77876448 -3.82691811 -4.48028716 -0.61384581 73.34285387
3 0.98658484 3.94827350 0.08570691 -0.10547240 3.98554189 -1.83077995 1.37429901 -3.37065573 3.26457466 4.06354385 3.13795020 0.60937972 -4.11297466 2.45439474 -3.65585323 -1.65094385 1.06715046 3.98054289 0.83041561 4.14411946 2.32224886 2.98783236 4.27752283 1.80140772 -1.76389781 -3.74814208 0.41296335 2.25257660 1472.80403013
4 -3.43981360 0.97899979 4.07566474 4.85650454 1.06429060 -3.30507253 2.26091334 -4.29431253 -1.79950399 -3.88802518 1.84731173 3.76653603 -3.79364129 -0.68340454 -4.36625030 -3.60727927 -0.75869329 -2.94860360 -0.78575449 -0.81229475 3.06561148 4.97934111 4.94907823 1.18611378 -0.11356810 1.85565287 2.09060519 -0.13331059 1327.60905524
Interactive prediction: import a CSV and predict on new data¶

Load the CSV file with the new data generated in the previous step and produce predictions with your .pkl model.

  • The .CSV file must be downloaded to a folder on your computer; it is normally saved directly to the Downloads folder.
In [ ]:
# ============================
# 1. Configurar Columnas Esperadas
# ============================
columnas_originales = ['V1', 'V2', 'V3', 'V4', 'V5', 'V6', 'V7', 'V8', 'V9',
                       'V10', 'V11', 'V12', 'V13', 'V14', 'V15', 'V16', 'V17',
                       'V18', 'V19', 'V20', 'V21', 'V22', 'V23', 'V24', 'V25',
                       'V26', 'V27', 'V28', 'Amount']

# ============================
# 2. Subir el archivo CSV
# ============================
print("🔄 Suba su archivo CSV con los datos para predecir:")
uploaded = files.upload()

# ============================
# 3. Validar el archivo cargado y su estructura
# ============================
for filename in uploaded.keys():
    try:
        print(f"\n✅ Archivo cargado: {filename}")
        data_nueva = pd.read_csv(filename)

        # Validar si el archivo contiene todas las columnas esperadas
        columnas_faltantes = [col for col in columnas_originales if col not in data_nueva.columns]
        if columnas_faltantes:
            raise ValueError(f"❌ Faltan las siguientes columnas en el archivo: {columnas_faltantes}")

        # Reordenar las columnas para asegurar la consistencia
        data_nueva = data_nueva[columnas_originales]

        # Mostrar las primeras filas después de la validación
        print("\n📊 Primeras filas del archivo validado:")
        display(data_nueva.head(3))  # Mostrar 3 primeras líneas

        # ============================
        # 4. Cargar el modelo ganador
        # ============================
        try:
            modelos_pkl = [f for f in os.listdir() if f.endswith("_final.pkl")]
            if not modelos_pkl:  # Verificar si la lista está vacía
                raise FileNotFoundError("❌ No se encontró ningún archivo '_final.pkl' en el directorio.")

            modelo_pkl = modelos_pkl[0]  # Tomar el primer archivo encontrado
            print(f"\n🔍 Cargando el modelo: {modelo_pkl}")
            modelo_ganador = joblib.load(modelo_pkl)

        except FileNotFoundError as fnf_error:
            print(fnf_error)
            print("⚠️ Asegúrese de que el modelo esté guardado con el nombre correcto en el directorio.")
            raise

        # ============================
        # 5. Realizar predicciones
        # ============================
        print("\n🚀 Realizando predicciones con el modelo ganador...")
        predicciones = modelo_ganador.predict(data_nueva)

        # ============================
        # 6. Añadir predicciones al DataFrame
        # ============================
        data_nueva['Class'] = predicciones  # Añadir columna con predicciones

        # Mostrar resultados
        print("\n📈 Resultados de la predicción (con columna 'Class'):")
        display(data_nueva.head(3))  # Mostrar 3 primeras líneas

        # ============================
        # 7. Resumen de las predicciones
        # ============================
        num_fraude = (data_nueva['Class'] == 1).sum()
        num_no_fraude = (data_nueva['Class'] == 0).sum()

        print("\n🔍 Resumen de Predicciones:")
        print(f"🔸 Transacciones predichas como FRAUDE (1): {num_fraude}")
        print(f"🔸 Transacciones predichas como NO FRAUDE (0): {num_no_fraude}")

        # ============================
        # 8. Guardar y Descargar el archivo con predicciones
        # ============================
        output_filename = "predicciones_resultados.csv"
        data_nueva.to_csv(output_filename, index=False)
        print(f"\n✅ Archivo con predicciones guardado como '{output_filename}'")

        files.download(output_filename)  # Descargar archivo

    except Exception as e:
        print(f"❌ Error: {e}")
🔄 Suba su archivo CSV con los datos para predecir:
Upload widget is only available when the cell has been executed in the current browser session. Please rerun this cell to enable.
Saving datos_nuevos_simulados.csv to datos_nuevos_simulados (1).csv

✅ Archivo cargado: datos_nuevos_simulados (1).csv

📊 Primeras filas del archivo validado:
V1 V2 V3 V4 V5 V6 V7 V8 V9 V10 V11 V12 V13 V14 V15 V16 V17 V18 V19 V20 V21 V22 V23 V24 V25 V26 V27 V28 Amount
0 -1.25459881 4.69584628 -4.68570814 4.08265886 1.42031646 -2.05551108 -4.48318279 0.03136259 -3.96876131 2.77146916 1.98161714 0.20163501 -3.31064937 -4.13079712 0.32589433 -3.83101959 2.07238634 -4.32923523 -2.92113745 1.00516860 -3.14867071 3.74701673 -4.42157323 2.08362977 2.58263196 2.96537291 -4.68413855 3.67031496 334.08381580
1 4.50714306 2.75132823 1.36410411 -2.60438109 -4.15860035 -1.14902271 0.31354632 3.56489841 4.02552907 0.58404250 0.36096366 3.52181500 -2.21409661 0.37106542 -4.48176463 4.39832124 -3.47460957 2.49960470 -4.73467796 1.65036675 0.41900947 0.28937134 4.69102630 3.37013328 -4.75413084 -0.18477648 4.36212246 0.78675409 335.23843258
2 2.31993942 4.39498942 -1.85644019 -3.55105128 -3.38371286 3.51136672 0.40635122 1.58693632 0.05252372 -0.75777991 -1.90472384 0.51906839 -3.22989516 0.86841118 -1.63395722 1.27708053 0.76288360 -2.90094407 -3.18564565 -3.24628721 3.72945836 4.39067699 3.83785885 1.97471462 -4.77876448 -3.82691811 -4.48028716 -0.61384581 73.34285387
🔍 Cargando el modelo: xgboost_final.pkl

🚀 Realizando predicciones con el modelo ganador...

📈 Resultados de la predicción (con columna 'Class'):
V1 V2 V3 V4 V5 V6 V7 V8 V9 V10 V11 V12 V13 V14 V15 V16 V17 V18 V19 V20 V21 V22 V23 V24 V25 V26 V27 V28 Amount Class
0 -1.25459881 4.69584628 -4.68570814 4.08265886 1.42031646 -2.05551108 -4.48318279 0.03136259 -3.96876131 2.77146916 1.98161714 0.20163501 -3.31064937 -4.13079712 0.32589433 -3.83101959 2.07238634 -4.32923523 -2.92113745 1.00516860 -3.14867071 3.74701673 -4.42157323 2.08362977 2.58263196 2.96537291 -4.68413855 3.67031496 334.08381580 1
1 4.50714306 2.75132823 1.36410411 -2.60438109 -4.15860035 -1.14902271 0.31354632 3.56489841 4.02552907 0.58404250 0.36096366 3.52181500 -2.21409661 0.37106542 -4.48176463 4.39832124 -3.47460957 2.49960470 -4.73467796 1.65036675 0.41900947 0.28937134 4.69102630 3.37013328 -4.75413084 -0.18477648 4.36212246 0.78675409 335.23843258 0
2 2.31993942 4.39498942 -1.85644019 -3.55105128 -3.38371286 3.51136672 0.40635122 1.58693632 0.05252372 -0.75777991 -1.90472384 0.51906839 -3.22989516 0.86841118 -1.63395722 1.27708053 0.76288360 -2.90094407 -3.18564565 -3.24628721 3.72945836 4.39067699 3.83785885 1.97471462 -4.77876448 -3.82691811 -4.48028716 -0.61384581 73.34285387 0
🔍 Resumen de Predicciones:
🔸 Transacciones predichas como FRAUDE (1): 31
🔸 Transacciones predichas como NO FRAUDE (0): 19

✅ Archivo con predicciones guardado como 'predicciones_resultados.csv'
Visualizations¶
In [ ]:
# Show the distribution of predicted fraudulent vs non-fraudulent transactions

colors = ["#87CEEB", "#FF6347"]

# Single-axes figure
fig, ax1 = plt.subplots(figsize=(10, 6))

# Helper that annotates the count on top of each bar
def add_labels(ax):
    for p in ax.patches:
        ax.annotate(f'{int(p.get_height())}', (p.get_x() + p.get_width() / 2,
                                               p.get_height()),
                    ha='center', va='bottom', fontsize=12, color='black',
                    weight='bold')

# Count how often each class occurs (0: no fraud, 1: fraud)
sns.countplot(x='Class', data=data_nueva, palette=colors, ax=ax1)
ax1.set_title('Transacciones fraudulentas vs no fraudulentas', fontsize=14,
              fontweight='bold')

ax1.set_xlabel("Tipo de Transacción")
ax1.set_ylabel("Cantidad de transacciones")
add_labels(ax1)

plt.xticks([0, 1], ['No-Fraude', 'Fraude'])
plt.show()
(figure: bar chart of predicted fraud vs non-fraud counts)

Conclusions¶

📊 Conclusions of the Model Analysis¶

Project goal: detect fraudulent transactions in an imbalanced dataset using Machine Learning models and advanced optimization techniques.


🔎 Strategies Applied in the Project¶

Specific techniques were implemented to address the challenges of class imbalance and to maximize the performance of the XGBoost and CatBoost models.

  1. Handling the Class Imbalance: SMOTE

    • The original dataset was severely imbalanced, with very few fraud records compared to non-fraudulent transactions.
    • SMOTE (Synthetic Minority Over-sampling Technique):
      • Applied exclusively to the training set to generate synthetic examples of the minority class (fraud).
      • This let the model learn more robust patterns for the minority class without discarding information, as undersampling would.
    • Reason for not training on the raw dataset alone:
      Training without any balancing technique would have biased the model toward the majority class, resulting in low recall for fraud.
  2. Cross-Validation (Stratified K-Fold):

    • A 5-fold split was used to evaluate model performance robustly, ensuring a balanced class distribution in every fold.
    • This kept the results from depending on a single split of the dataset, yielding more generalizable metrics.
  3. Hyperparameter Optimization:

    • Two advanced techniques were used to tune the hyperparameters:
      • GridSearchCV: exhaustively explored predefined hyperparameter combinations.
      • Optuna: automated the search with efficient Bayesian optimization, reducing compute time.

🛠️ Challenges Addressed¶

  1. Dataset imbalance: resolved with SMOTE applied to the training set.
  2. Generalizable validation: guaranteed through cross-validation.
  3. Hyperparameter tuning: optimized with GridSearchCV and Optuna.
  4. Algorithm comparison: XGBoost and CatBoost evaluated on key metrics (Recall, Precision, AUPRC).

⚠️ Notes on the Accuracy Metric¶

Although the winning model's accuracy was 99.62%, this metric is not appropriate for imbalanced datasets such as fraud detection.

  • A model that predicted everything as "No Fraud" would still reach high accuracy, yet its ability to identify fraud would be zero.
  • For this reason, Recall and AUPRC were prioritized, since they are more relevant for evaluating performance on the minority class (fraud).
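A quick numeric illustration of this caveat: on a 99.8% / 0.2% split, a degenerate classifier that always predicts "No Fraud" reaches near-perfect accuracy while its recall for fraud is zero.

```python
import numpy as np
from sklearn.metrics import accuracy_score, recall_score

# 1000 transactions, only 2 of them fraudulent
y_true = np.zeros(1000, dtype=int)
y_true[:2] = 1

# Degenerate classifier: predicts "No Fraud" for everything
y_pred = np.zeros(1000, dtype=int)

print(f"Accuracy: {accuracy_score(y_true, y_pred):.4f}")        # 0.9980
print(f"Recall (fraud): {recall_score(y_true, y_pred, zero_division=0):.4f}")  # 0.0000
```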

🚀 Next Steps and Future Improvements¶

Due to time constraints, some additional strategies that could further improve model performance were not implemented:

  1. Advanced feature engineering: create new, more informative features that help detect complex patterns.
  2. Classification threshold optimization: adjust the decision threshold to balance Recall and Precision.
  3. Model deployment: put the model into production and test it with real-time data.
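Point 2 above (threshold optimization) could be sketched as follows: instead of the default 0.5 cut-off, scan the precision-recall curve for the threshold that maximizes F1. The synthetic scores here are a stand-in for the output of `modelo_ganador.predict_proba`:

```python
import numpy as np
from sklearn.metrics import precision_recall_curve

# Synthetic labels and scores loosely correlated with them
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, 1000)
scores = np.clip(0.3 * y_true + rng.random(1000) * 0.7, 0, 1)

prec, rec, thresholds = precision_recall_curve(y_true, scores)
# F1 per candidate threshold (the last precision/recall point has no threshold)
f1 = 2 * prec[:-1] * rec[:-1] / np.clip(prec[:-1] + rec[:-1], 1e-12, None)
best = int(np.argmax(f1))
print(f"Best threshold: {thresholds[best]:.3f} (F1 = {f1[best]:.3f})")
```

In a fraud setting one might instead maximize recall subject to a minimum acceptable precision, since a missed fraud (false negative) is usually costlier than a false alarm.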

Generate a dynamic PDF report for the winning model¶

This code builds a dynamic PDF report whose content is based on your data and results.

In [ ]:
%%capture output
import pandas as pd
from weasyprint import HTML
from datetime import datetime
from IPython.display import display
import matplotlib.pyplot as plt
import seaborn as sns

# =============================
# 1. Extraer Datos Dinámicos
# =============================
# Supongamos que resultados_maestro es tu DataFrame final
best_model = resultados_maestro.loc[resultados_maestro["AUPRC_Test"].idxmax()]

# Extraer valores del modelo ganador
best_model_name = best_model["Modelo"]
best_technique = best_model["Tecnica"]
recall = best_model["Recall_Test"]
precision = best_model["Precision_Test"]
auprc = best_model["AUPRC_Test"]
f1_score = best_model["F1-Score_Test"]
accuracy = best_model["Accuracy_Test"]

# =============================
# 2. Generar Gráficos
# =============================
# Gráfico de comparación de modelos
plt.figure(figsize=(10, 6))
sns.barplot(x=resultados_maestro["Modelo"], y=resultados_maestro["AUPRC_Test"], palette="viridis")
plt.title("Comparación de Modelos: AUPRC")
plt.ylabel("AUPRC (%)")
plt.xlabel("Modelo")
plt.tight_layout()
plt.savefig("comparacion_modelos.png")  # Guardar gráfico
plt.close()

# =============================
# 3. Crear el Contenido del Informe
# =============================
html_content = f"""
<!DOCTYPE html>
<html lang="es">
<head>
    <meta charset="UTF-8">
    <title>Informe Final - Proyecto de Detección de Fraude</title>
    <style>
        body {{ font-family: Arial, sans-serif; margin: 20px; }}
        h1, h2 {{ color: #2E4053; }}
        table {{ border-collapse: collapse; width: 100%; }}
        th, td {{ border: 1px solid #ddd; padding: 8px; text-align: center; }}
        th {{ background-color: #4CAF50; color: white; }}
        img {{ display: block; margin: 20px auto; max-width: 80%; }}
        ul {{ margin: 10px; }}
        li {{ margin: 5px 0; }}
    </style>
</head>
<body>
    <h1>Informe Final - Proyecto de Detección de Fraude</h1>
    <p>Fecha de Generación: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}</p>

    <h2>🏆 Modelo Ganador: {best_model_name} ({best_technique})</h2>
    <table>
        <tr>
            <th>Métrica</th><th>Valor</th>
        </tr>
        <tr><td>Recall</td><td>{recall:.2f}%</td></tr>
        <tr><td>Precision</td><td>{precision:.2f}%</td></tr>
        <tr><td>AUPRC</td><td>{auprc:.2f}%</td></tr>
        <tr><td>F1-Score</td><td>{f1_score:.2f}%</td></tr>
        <tr><td>Accuracy</td><td>{accuracy:.2f}%</td></tr>
    </table>

    <h2>Comparación de Modelos</h2>
    <img src="comparacion_modelos.png" alt="Comparación de Modelos">

    <h2>Conclusión</h2>
    <p>
        El modelo <strong>{best_model_name}</strong> optimizado mediante
        <strong>{best_technique}</strong> es el más adecuado para la detección de fraudes
        debido a su equilibrio en las métricas clave.
    </p>

    <h2>Próximos Pasos</h2>
    <ul>
        <li>Optimización del umbral de clasificación.</li>
        <li>Despliegue del modelo en producción.</li>
        <li>Monitoreo continuo del rendimiento del modelo.</li>
    </ul>
</body>
</html>
"""

# =============================
# 4. Guardar el HTML y Convertirlo a PDF
# =============================
# Guardar el contenido en un archivo HTML
html_filename = "informe_final.html"
with open(html_filename, "w", encoding="utf-8") as f:
    f.write(html_content)

# Convertir el HTML a PDF
pdf_filename = "informe_final.pdf"
HTML(html_filename).write_pdf(pdf_filename)

# Mensaje de confirmación
print(f"✅ Informe PDF generado exitosamente: {pdf_filename}")

from google.colab import files
# Descargar automáticamente el archivo PDF
files.download(pdf_filename)

Export the notebook to HTML and to PDF¶

Download the generated files directly from Colab¶

In [17]:
import json
from google.colab import files

# Clean and validate the notebook: some Colab exports omit the required
# 'metadata' key on cell outputs, which makes nbconvert's validator fail.
def fix_notebook_metadata(filepath):
    with open(filepath, 'r', encoding='utf-8') as file:
        notebook = json.load(file)

    for cell in notebook.get('cells', []):
        if 'outputs' in cell:
            for output in cell['outputs']:
                # Agregar 'metadata' si no existe
                if 'metadata' not in output:
                    output['metadata'] = {}

    # Guardar el notebook limpio
    with open(filepath, 'w', encoding='utf-8') as file:
        json.dump(notebook, file, indent=2, ensure_ascii=False)

# Rutas de archivos
notebook_path = "/content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim.ipynb"
html_output_path = "/content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim.html"
pdf_output_path_viahtml = "/content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim-pdfviahtml.pdf"

# Clean the notebook so nbconvert can validate it
print("\nLimpiando el notebook...")
fix_notebook_metadata(notebook_path)

# Exportar a HTML
print("\nExportando a formato HTML...")
!jupyter nbconvert --to html --output "$html_output_path" "$notebook_path"

# Exportar a PDF usando PDFviaHTML
print("\nExportando a formato PDF usando nbconvert con PDFviaHTML...")
!jupyter nbconvert --to pdfviahtml --output "$pdf_output_path_viahtml" "$notebook_path"

# Descargar archivos generados
print("\nDescargando los archivos generados...")
try:
    files.download(html_output_path)  # Descargar HTML
    files.download(pdf_output_path_viahtml)  # Descargar PDF generado con PDFviaHTML
except Exception as e:
    print(f"Error descargando los archivos: {e}")

print("\nExportación y descarga completadas.")
Limpiando el notebook...

Exportando a formato HTML...
[NbConvertApp] Converting notebook /content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim.ipynb to html
[NbConvertApp] ERROR | Notebook JSON is invalid: 'metadata' is a required property

Failed validating 'required' in display_data:

On instance['cells'][20]['outputs'][0]:
{'data': {'text/html': '<style>.container { width:100% !important; }</style>',
          'text/plain': '<IPython.core.display.HTML object>'},
 'output_type': 'display_data'}
[NbConvertApp] ERROR | Notebook is invalid after preprocessor <nbconvert.preprocessors.tagremove.TagRemovePreprocessor object at 0x7a0a9dfb5060>
Traceback (most recent call last):
  File "/usr/local/bin/jupyter-nbconvert", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/jupyter_core/application.py", line 283, in launch_instance
    super().launch_instance(argv=argv, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/traitlets/config/application.py", line 992, in launch_instance
    app.start()
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/nbconvertapp.py", line 420, in start
    self.convert_notebooks()
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/nbconvertapp.py", line 597, in convert_notebooks
    self.convert_single_notebook(notebook_filename)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/nbconvertapp.py", line 563, in convert_single_notebook
    output, resources = self.export_single_notebook(
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/nbconvertapp.py", line 487, in export_single_notebook
    output, resources = self.exporter.from_filename(
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/templateexporter.py", line 386, in from_filename
    return super().from_filename(filename, resources, **kw)  # type:ignore[return-value]
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/exporter.py", line 201, in from_filename
    return self.from_file(f, resources=resources, **kw)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/templateexporter.py", line 392, in from_file
    return super().from_file(file_stream, resources, **kw)  # type:ignore[return-value]
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/exporter.py", line 220, in from_file
    return self.from_notebook_node(
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/html.py", line 268, in from_notebook_node
    html, resources = super().from_notebook_node(nb, resources, **kw)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/templateexporter.py", line 408, in from_notebook_node
    nb_copy, resources = super().from_notebook_node(nb, resources, **kw)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/exporter.py", line 154, in from_notebook_node
    nb_copy, resources = self._preprocess(nb_copy, resources)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/exporter.py", line 355, in _preprocess
    self._validate_preprocessor(nbc, preprocessor)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/exporter.py", line 322, in _validate_preprocessor
    nbformat.validate(nbc, relax_add_props=True)
  File "/usr/local/lib/python3.10/dist-packages/nbformat/validator.py", line 509, in validate
    raise error
nbformat.validator.NotebookValidationError: 'metadata' is a required property

Failed validating 'required' in display_data:

On instance['cells'][20]['outputs'][0]:
{'data': {'text/html': '<style>.container { width:100% !important; }</style>',
          'text/plain': '<IPython.core.display.HTML object>'},
 'output_type': 'display_data'}

Exportando a formato PDF usando nbconvert con PDFviaHTML...
[NbConvertApp] Converting notebook /content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim.ipynb to pdfviahtml
[NbConvertApp] ERROR | Notebook JSON is invalid: 'metadata' is a required property

Failed validating 'required' in display_data:

On instance['cells'][20]['outputs'][0]:
{'data': {'text/html': '<style>.container { width:100% !important; }</style>',
          'text/plain': '<IPython.core.display.HTML object>'},
 'output_type': 'display_data'}
[NbConvertApp] ERROR | Notebook is invalid after preprocessor <nbconvert.preprocessors.tagremove.TagRemovePreprocessor object at 0x7d4e28720220>
Traceback (most recent call last):
  File "/usr/local/bin/jupyter-nbconvert", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/jupyter_core/application.py", line 283, in launch_instance
    super().launch_instance(argv=argv, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/traitlets/config/application.py", line 992, in launch_instance
    app.start()
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/nbconvertapp.py", line 420, in start
    self.convert_notebooks()
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/nbconvertapp.py", line 597, in convert_notebooks
    self.convert_single_notebook(notebook_filename)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/nbconvertapp.py", line 563, in convert_single_notebook
    output, resources = self.export_single_notebook(
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/nbconvertapp.py", line 487, in export_single_notebook
    output, resources = self.exporter.from_filename(
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/templateexporter.py", line 386, in from_filename
    return super().from_filename(filename, resources, **kw)  # type:ignore[return-value]
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/exporter.py", line 201, in from_filename
    return self.from_file(f, resources=resources, **kw)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/templateexporter.py", line 392, in from_file
    return super().from_file(file_stream, resources, **kw)  # type:ignore[return-value]
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/exporter.py", line 220, in from_file
    return self.from_notebook_node(
  File "/usr/local/lib/python3.10/dist-packages/notebook_as_pdf/__init__.py", line 222, in from_notebook_node
    html_notebook, resources = html_exporter.from_notebook_node(
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/html.py", line 268, in from_notebook_node
    html, resources = super().from_notebook_node(nb, resources, **kw)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/templateexporter.py", line 408, in from_notebook_node
    nb_copy, resources = super().from_notebook_node(nb, resources, **kw)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/exporter.py", line 154, in from_notebook_node
    nb_copy, resources = self._preprocess(nb_copy, resources)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/exporter.py", line 355, in _preprocess
    self._validate_preprocessor(nbc, preprocessor)
  File "/usr/local/lib/python3.10/dist-packages/nbconvert/exporters/exporter.py", line 322, in _validate_preprocessor
    nbformat.validate(nbc, relax_add_props=True)
  File "/usr/local/lib/python3.10/dist-packages/nbformat/validator.py", line 509, in validate
    raise error
nbformat.validator.NotebookValidationError: 'metadata' is a required property

Failed validating 'required' in display_data:

On instance['cells'][20]['outputs'][0]:
{'data': {'text/html': '<style>.container { width:100% !important; }</style>',
          'text/plain': '<IPython.core.display.HTML object>'},
 'output_type': 'display_data'}

Descargando los archivos generados...
Exportación y descarga completadas.
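The "'metadata' is a required property" failure above comes from display_data outputs that lack the metadata field the nbformat v4 schema requires. One way to repair the notebook before re-running nbconvert is to patch the raw JSON — a minimal sketch, not part of the original code (function name and usage paths are illustrative):

```python
import json

def add_missing_output_metadata(nb_json):
    """Add the 'metadata' dict that the nbformat v4 schema requires
    on display_data and execute_result outputs."""
    for cell in nb_json.get("cells", []):
        for output in cell.get("outputs", []):
            if (output.get("output_type") in ("display_data", "execute_result")
                    and "metadata" not in output):
                output["metadata"] = {}
    return nb_json

# Typical usage (path illustrative):
# with open(notebook_path, encoding="utf-8") as f:
#     nb = json.load(f)
# add_missing_output_metadata(nb)
# with open(notebook_path, "w", encoding="utf-8") as f:
#     json.dump(nb, f)
```

Saving the notebook again from a current Jupyter/Colab frontend usually normalizes these outputs as well; the patch is only needed when re-editing is not an option.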
In [16]:
# Paths to the notebook file and the generated outputs
notebook_path = \
      "/content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim.ipynb"

html_output_path = \
      "/content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim.html"

pdf_output_path_viahtml = \
      "/content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim-pdfviahtml.pdf"
#pdf_output_path_pandoc = \
#      "/content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim-pandoc.pdf"

# Export the notebook to HTML and PDF
print("\nExportando a formato HTML...")
!jupyter nbconvert --to html "$notebook_path" --output "$html_output_path"

print("\nExportando a formato PDF usando nbconvert con PDFviaHTML...")
!jupyter nbconvert --to pdfviahtml "$notebook_path" \
                    --output "$pdf_output_path_viahtml"

#print("\nExportando a formato PDF usando Pandoc...")
#!jupyter nbconvert --to pdf "$notebook_path" --output "$pdf_output_path_pandoc"

# Download the generated files
print("\nDescargando los archivos generados...")
try:
    files.download(html_output_path)  # Download the HTML
    files.download(pdf_output_path_viahtml)  # Download the PDF generated with PDFviaHTML
#    files.download(pdf_output_path_pandoc)  # Download the PDF generated with Pandoc
except Exception as e:
    print(f"Error downloading the files: {e}")

print("\nExportación y descarga completadas.")
Exportando a formato HTML...
[NbConvertApp] Converting notebook /content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim.ipynb to html
[NbConvertApp] ERROR | Notebook JSON is invalid: Additional properties are not allowed ('metadata' was unexpected)

Failed validating 'additionalProperties' in stream:

On instance['cells'][163]['outputs'][0]:
{'metadata': {'tags': None},
 'name': 'stderr',
 'output_type': 'stream',
 'text': '[I 2024-12-19 13:22:34,041] A new study created in memory with '
         'n...'}
[NbConvertApp] WARNING | Alternative text is missing on 19 image(s).
[NbConvertApp] Writing 7095636 bytes to /content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim.html

Exportando a formato PDF usando nbconvert con PDFviaHTML...
[NbConvertApp] Converting notebook /content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim.ipynb to pdfviahtml
[NbConvertApp] ERROR | Notebook JSON is invalid: Additional properties are not allowed ('metadata' was unexpected)

Failed validating 'additionalProperties' in stream:

On instance['cells'][163]['outputs'][0]:
{'metadata': {'tags': None},
 'name': 'stderr',
 'output_type': 'stream',
 'text': '[I 2024-12-19 13:22:34,041] A new study created in memory with '
         'n...'}
[NbConvertApp] WARNING | Alternative text is missing on 19 image(s).
[NbConvertApp] Writing 12371924 bytes to /content/drive/MyDrive/Colab Notebooks/Proyecto_Final_DeteccionFraudeTarjetaCredito_ErikaSamaraAlvaresAngelim-pdfviahtml.pdf

Descargando los archivos generados...
Exportación y descarga completadas.
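This second run still logs a schema error, but now the inverse problem: a stream output carries a 'metadata' key that the v4 schema forbids ("Additional properties are not allowed"). nbconvert proceeds and writes both files anyway, so this is cosmetic; the notebook can nevertheless be cleaned up the same way — a minimal sketch on the raw notebook JSON (function name illustrative):

```python
def strip_stream_metadata(nb_json):
    """Remove 'metadata' from stream outputs, where the nbformat v4
    schema disallows additional properties."""
    for cell in nb_json.get("cells", []):
        for output in cell.get("outputs", []):
            if output.get("output_type") == "stream":
                output.pop("metadata", None)
    return nb_json
```

Applied together with the earlier metadata fix, this should leave nbconvert with no validation errors, only the alt-text warnings on the embedded images.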